I still don't have enough information to form any recommendations; I'd need to see what's going on inside the methods being called.
Why do you need to split keywords? Is this an effort to recreate a full-text index? I don't know much about PostgreSQL, but if I were you I'd look into its built-in tools for database optimization. Typically this means creating indexes on your database tables, which make data retrieval far more efficient.
Some time with Google ought to help you with this one.
You can gather some stats yourself: time the execution of each function to narrow things down. You can do that with a small microtime-based helper.
function microtime_float()
{
    // Note: on PHP 5+ you can simply call microtime(true) instead.
    list($usec, $sec) = explode(' ', microtime());
    return ((float)$usec + (float)$sec);
}
Then to do a benchmark, place the following where you want to begin the benchmark...
$time_start = microtime_float();
...and the following where you want to end the benchmark...
echo round((microtime_float() - $time_start), 2) . ' seconds';
This will give you the execution time in seconds. If you're seeing anything over a second, your program is taking far too long to execute. You can use this technique repeatedly to narrow the bottleneck down to a particular snippet of code. Most web applications do their business in a fraction of a second, and if you're using that much memory, I'd venture that time is also an issue.
It really sounds like you need full-text indexing on your database tables, which would eliminate the need for splitting up keywords and that sort of thing. But I'm just speculating; again, I'd need to see what those functions are doing to know for sure.
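Just as a sketch of what I mean, a PostgreSQL full-text index looks something like this (the `articles` table and `body` column here are hypothetical names for illustration; adjust to your own schema):

```sql
-- A GIN index over a tsvector lets PostgreSQL match keywords itself,
-- so you don't have to split strings up in PHP.
CREATE INDEX idx_articles_body_fts
    ON articles
    USING GIN (to_tsvector('english', body));

-- Then search with the @@ operator instead of LIKE or manual splitting:
SELECT id, title
FROM articles
WHERE to_tsvector('english', body) @@ to_tsquery('english', 'keyword');
```

The index only pays off if the query uses the same `to_tsvector(...)` expression it was built on, so keep the two in sync.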
The other thing you can do is implement garbage collection. After everything you do in your script, be sure to clean up after yourself. unset() variables after you're finished with them. Close connections to outside services. Free result sets after you're done with a query (as mentioned in my previous post).
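That cleanup might look something like this in PHP (the variable and query names here are made up for illustration, assuming the pgsql extension):

```php
// Hypothetical names for illustration only.
$conn   = pg_connect('host=localhost dbname=mydb');
$result = pg_query($conn, 'SELECT id, title FROM articles');

$rows = pg_fetch_all($result);
// ... work with $rows ...

pg_free_result($result);   // free the result set's memory as soon as you're done
unset($rows);              // drop large variables you no longer need
pg_close($conn);           // close the connection when the script is finished
```

None of this is strictly required (PHP frees everything at the end of the request), but in a memory-heavy script, releasing things early keeps your peak usage down.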