But the usual way of solving this problem is to set up a mail queue and have a cron job or daemon process it. That way you're not relying on your database to send email. Having an already heavily taxed database server send email is probably not the best use of resources for most people.
Make a table that holds everything to be sent. Write a trigger that inserts an entry for each email the user "sends". Have a cron job on another machine pick the entries up every minute or two and send them. Lock the records the script is working on, and have it check whether another copy is already running and terminate if so.
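Roughly like this, as a sketch. The table and column names here are made up, and it assumes the app writes to a hypothetical messages table with recipient/subject/body columns:

-- Hypothetical queue table; adjust columns to whatever you actually send.
CREATE TABLE email_queue (
    id         bigserial PRIMARY KEY,
    recipient  text        NOT NULL,
    subject    text        NOT NULL,
    body       text        NOT NULL,
    queued_at  timestamptz NOT NULL DEFAULT now(),
    sent_at    timestamptz          -- stays NULL until a worker sends it
);

-- Trigger function: whenever the app inserts into the (assumed)
-- messages table, drop a matching row into the queue.
CREATE FUNCTION queue_outgoing_email() RETURNS trigger AS $$
BEGIN
    INSERT INTO email_queue (recipient, subject, body)
    VALUES (NEW.recipient, NEW.subject, NEW.body);
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER messages_queue_email
    AFTER INSERT ON messages
    FOR EACH ROW EXECUTE PROCEDURE queue_outgoing_email();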
That way, if there's a problem sending email, it won't DoS the database server with filled hard drives, stalled processes, etc. If you really want the database itself to trigger the email, then set up an internal mail server that answers to your database and sends mail in queued mode. The messages then sit on another server that happily accepts email from the db server and queues it.
So, what are the requirements here in terms of timeliness, load, and volume? If you need to send a million copies of a newsletter at night, during off-peak hours, use multiple machines sending email from a list pulled out of the db. For more throughput, add machines until your connection is saturated.
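Each sender box would then loop on a claim query against the queue table sketched above. Instead of the lockfile dance, I'm using FOR UPDATE SKIP LOCKED here (PostgreSQL 9.5+), which lets several workers share one queue without grabbing each other's rows; on older versions you'd fall back to advisory locks or a single worker:

-- Atomically claim up to 100 unsent rows; concurrent workers skip
-- rows another worker already has locked.
WITH batch AS (
    SELECT id
    FROM   email_queue
    WHERE  sent_at IS NULL
    ORDER  BY queued_at
    LIMIT  100
    FOR UPDATE SKIP LOCKED
)
UPDATE email_queue q
SET    sent_at = now()   -- or a separate claimed_at column if you want
                         -- to retry rows whose send later fails
FROM   batch
WHERE  q.id = batch.id
RETURNING q.recipient, q.subject, q.body;

The worker hands the returned batch to its local MTA and commits.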
If you need to send more timely emails, like registration emails, then you probably do it some other way. I'd have an internal mail server and have the app send the email itself. A trigger-populated table would be nice for logging what was sent or couldn't be sent. Here even cascading foreign keys would work fine: since FK constraints are implemented as triggers under the hood in PostgreSQL, a plain FK-PK relationship gets you trigger-like behavior, and it's way easier for most people than writing a trigger by hand. So if you can make it work with a cascading FK relationship, I'd do that too.
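For example, something like this hypothetical log table (it assumes a users table keyed on id); the cascade keeps the log consistent with the users table without any hand-written trigger code:

-- Illustrative log table; PostgreSQL enforces the cascades with
-- internal system triggers, so deleting or renumbering a user
-- automatically tidies up their log rows.
CREATE TABLE email_log (
    id        bigserial PRIMARY KEY,
    user_id   integer NOT NULL
              REFERENCES users (id)
              ON UPDATE CASCADE
              ON DELETE CASCADE,
    subject   text,
    status    text NOT NULL DEFAULT 'sent',  -- e.g. 'sent' or 'failed'
    logged_at timestamptz NOT NULL DEFAULT now()
);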
So, in a nutshell, OP, what are you trying to accomplish?