Connect to Multiple Databases with Zend Framework

Published: 05/31/2011

Brain Dump, Programming, Code

I recently had a project that needed two databases working together: one hosted locally and the other hosted externally by the client. In plain PHP this wouldn’t be a problem (just pass a link to the database I want on a per-query basis and I’m done), but since this project was to use the Zend Framework and I’d never attempted this sort of thing with it before, I was in quite the pickle.

Some quick Googling turned up an article that seemed to fit the bill: Zend Framework: Connecting to 2 databases by Wenbert Del Rosario. Unfortunately, while it was a good read and very informative, my circumstances were a little different. The database I had to connect to was hosted on an external network I had no local access to; I had to go over the open Internet to reach it. Following Wenbert’s tutorial would mean opening that connection on every request whether it was needed or not, and that just wouldn’t do.

Anyone who’s ever had to connect to an external database can tell you it’s far from an ideal scenario. The latency involved is noticeable, and caching is pretty much required for any sane setup. Still, Wenbert’s article was definitely helpful in shaping my approach.

As in Wenbert’s article, I started by putting the connection information into my config.ini file:

; config.ini
resources.db.adapter = PDO_MYSQL
resources.db.params.host = localhost
resources.db.params.username = ****
resources.db.params.password = ****
resources.db.params.dbname = local_db 
 
externaldb.adapter = PDO_MYSQL
externaldb.params.host = example.com
externaldb.params.username = ****
externaldb.params.password = ****
externaldb.params.dbname = external_db
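
For the DbTable class below to work, that config needs to end up in Zend_Registry under the 'settings' key. I do that in my bootstrap with something along these lines (just a sketch assuming a standard Zend_Application setup; the file path and method name are whatever fits your project):

<?php
//Bootstrap.php - sketch only, adjust paths to your own layout
class Bootstrap extends Zend_Application_Bootstrap_Bootstrap
{
	protected function _initSettings()
	{
		// Load config.ini and stash it in the registry so that
		// Model_DbTable_External can grab the externaldb section later
		$settings = new Zend_Config_Ini(APPLICATION_PATH . '/configs/config.ini');
		Zend_Registry::set('settings', $settings);
		return $settings;
	}
}
?>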

Then, I created a standard Zend_Db_Table class that handled the connection:

<?php
//Model/DbTable/External.php
//name made up to protect the client but do yourself a favor and name your shit logically
class Model_DbTable_External extends Model_DbTable_Abstract
{
	/**
	 * Doesn't matter for our use but ZF demands a $_name variable
	 * @var string
	 */
	protected $_name = "content";

	/**
	 * The adapter for the external database
	 * @var Zend_Db_Adapter_Abstract
	 */
	public $db;

	public function init()
	{
		// Build an adapter from the externaldb section of config.ini
		$settings = Zend_Registry::get('settings');
		$this->db = Zend_Db::factory($settings->externaldb);
	}
}
?>

With the above out of the way I can now connect to the external database; I just have to pipe all of my SQL through that class. To that end, and to provide a nice wrapper for the caching, I created a Model with methods for all the SQL my app will need:

<?php
class Model_External extends Model_Abstract
{
	/**
	 * The key to use for the cache items
	 * @var string
	 */
	public $cache_key = 'external_db';

	public function __construct()
	{
		parent::__construct();
		// Borrow the adapter built by Model_DbTable_External
		$db = new Model_DbTable_External;
		$this->db = $db->db;
	}

	public function testSystem($id)
	{
		$key = $this->cache_key . __FUNCTION__ . $id;
		if(!$data = $this->cache->load($key))
		{
			//just a random example query to show it's possible 😊
			$sql = $this->db->select()
				->from(array('e' => 'example_table1'), array('e.id', 'e.title'))
				->where('e.id = ?', $id)
				->limit(6);
			$data = $this->db->fetchAll($sql);
			$this->cache->save($data, $key, array($this->cache_key));
		}
		return $data;
	}
}
?>

So, using the above class I’ve abstracted out the SQL, with the caching provided by the Model_Abstract class below:

<?php
//models/Abstract.php
abstract class Model_Abstract
{
	/**
	 * The database object
	 * @var object
	 */
	public $db;
 
	/**
	 * The Cache Object
	 * @var object
	 */
	public $cache;
 
	/**
	 * The stored cache name
	 * @var string
	 */
	public $cache_key = null;
 
	public function __construct()
	{
		// getting a Zend_Cache_Core object with a File backend;
		// 7200 seconds gives the 2 hour lifetime mentioned below
		$this->cache = Zend_Cache::factory(
			'Core',
			'File',
			array('lifetime' => 7200, 'automatic_serialization' => true),
			array('cache_dir' => '/path/to/cache/')
		);
	}
}
?>
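
One nice side-effect of passing $this->cache_key in as a tag when saving: if the client’s data changes, everything Model_External has cached can be dropped in one call instead of waiting for the lifetime to run out. A quick sketch of what that looks like (where and when you’d call it is up to you):

// Purge every cache entry tagged with the external db's cache key
$model = new Model_External();
$model->cache->clean(Zend_Cache::CLEANING_MODE_MATCHING_TAG, array($model->cache_key));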

Putting it all together, I now have a mechanism to connect to an external database, perform queries against it, and cache the results easily with minimal code. One of the advantages of this approach is that connections to the external database only happen when needed rather than on every page request as in Wenbert’s article. As long as the cached items exist, no database connection is made and no latency is incurred (outside of hitting the filesystem, of course).
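
For completeness, here’s roughly how a controller ends up consuming the model (the controller and action names here are made up for illustration):

<?php
//controllers/ExampleController.php - illustrative names only
class ExampleController extends Zend_Controller_Action
{
	public function viewAction()
	{
		$external = new Model_External();
		// The first hit for a given id pays the connection cost and
		// primes the cache; later hits never touch the external db
		$this->view->rows = $external->testSystem($this->_getParam('id'));
	}
}
?>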

Still, there are ways to improve on this approach. Using APC instead of the filesystem for the caching would help, as would increasing the cache lifetime from 2 hours to something longer, but overall the above should be a good starting point.
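
Swapping the backend, for instance, is a one-argument change in Model_Abstract’s constructor; assuming the APC extension is installed, something like this should do it:

// In Model_Abstract::__construct() - APC backend instead of File,
// with the lifetime bumped to 6 hours as an example
$this->cache = Zend_Cache::factory(
	'Core',
	'Apc',
	array('lifetime' => 21600, 'automatic_serialization' => true),
	array() // the APC backend needs no backend options
);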