Using multiple databases

This topic guide describes Django's support for interacting with multiple databases. Most of the rest of Django's documentation assumes you are working with a single database. If you want to work with multiple databases, you'll need to take some additional steps.

Defining your databases

The first step to using more than one database with Django is to tell Django about the database servers you'll be using. This is done with the DATABASES setting. This setting maps database aliases, which are how those databases are referred to throughout Django, to dictionaries of settings describing the connection to each database. The settings in the inner dictionaries are described fully in the DATABASES documentation.

Databases can have any alias you choose. However, the alias default has special significance. Django uses the database with the alias default whenever no other database has been selected.

The following is an example settings.py snippet defining two databases – a default PostgreSQL database and a MySQL database called users:

DATABASES = {
    'default': {
        'NAME': 'app_data',
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'USER': 'postgres_user',
        'PASSWORD': 's3krit'
    },
    'users': {
        'NAME': 'user_data',
        'ENGINE': 'django.db.backends.mysql',
        'USER': 'mysql_user',
        'PASSWORD': 'priv4te'
    }
}

If the concept of a default database doesn't make sense in the context of your project, you need to be careful to always specify which database you want to use in every case. Django requires that a default entry exist in the database configuration, but its parameters dictionary can be left blank if it will not be used. The following example settings.py snippet defines two non-default databases, with the default entry intentionally left empty:

DATABASES = {
    'default': {},
    'users': {
        'NAME': 'user_data',
        'ENGINE': 'django.db.backends.mysql',
        'USER': 'mysql_user',
        'PASSWORD': 'superS3cret'
    },
    'customers': {
        'NAME': 'customer_data',
        'ENGINE': 'django.db.backends.mysql',
        'USER': 'mysql_cust',
        'PASSWORD': 'veryPriv@ate'
    }
}

If you attempt to access a database that you haven't defined in your DATABASES setting, Django will raise a django.db.utils.ConnectionDoesNotExist exception.
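For instance (the reporting alias here is hypothetical and deliberately absent from the settings above), asking for an unknown connection fails immediately:

>>> from django.db import connections
>>> connections['reporting'].cursor()  # 'reporting' is not defined in DATABASES
Traceback (most recent call last):
    ...
django.db.utils.ConnectionDoesNotExist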

Synchronizing your databases

The syncdb management command operates on one database at a time. By default, it operates on the default database, but by providing a --database argument, you can tell it to synchronize a different database. So, to synchronize all models onto all of the databases in the first example above, you would need to call:

$ ./manage.py syncdb
$ ./manage.py syncdb --database=users

If you don't want every application to be synchronized onto a particular database, you can define a database router that implements a policy constraining the availability of particular models.

Alternatively, if you want fine-grained control of synchronization, you can pipe all or part of the output of sqlall for a particular application directly into your database prompt, like this:

$ ./manage.py sqlall sales | ./manage.py dbshell

Using other management commands

The other django-admin.py commands that interact with the database operate in the same way as syncdb – they only ever operate on one database at a time. Use --database to specify which database to work with.
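For example, to load a fixture into the users database and then open an interactive shell on it (the fixture name is illustrative), you could run:

$ ./manage.py loaddata initial_users --database=users
$ ./manage.py dbshell --database=users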

Automatic database routing

The easiest way to use multiple databases is to set up a database routing scheme. The default routing scheme ensures that objects remain "sticky" to their original database (i.e., an object retrieved from the foo database will be saved to the same database). The default routing scheme also ensures that if a database isn't specified, all queries fall back to the default database.

You don't have to do anything to activate the default routing scheme – it is provided "out of the box" on every Django project. However, if you want to implement more interesting database allocation behavior, you can define and install your own database routers.

Database routers

A database router is a class that provides up to four methods:

db_for_read(model, **hints)

Suggest the database that should be used for read operations for objects of type model.

If a database operation is able to provide any additional information that might assist in selecting a database, it will be provided in the hints dictionary. Details on valid hints are provided below in the Hints section.

Return None if there is no suggestion.

db_for_write(model, **hints)

Suggest the database that should be used for writes of objects of type model.

If a database operation is able to provide any additional information that might assist in selecting a database, it will be provided in the hints dictionary. Details on valid hints are provided below in the Hints section.

Return None if there is no suggestion.

allow_relation(obj1, obj2, **hints)

Return True if a relation between obj1 and obj2 should be allowed, False if the relation should be prevented, or None if the router has no opinion. This is purely a validation operation, used by foreign key and many-to-many operations to determine whether a relation should be allowed between two objects.

allow_syncdb(db, model)

Determine if the model should be synchronized onto the database with alias db. Return True if the model should be synchronized, False if it should not be synchronized, or None if the router has no opinion. This method can be used to determine the availability of a model on a given database.

A router doesn’t have to provide all these methods – it may omit one or more of them. If one of the methods is omitted, Django will skip that router when performing the relevant check.
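As a minimal sketch of a partial router (the app label logs and the alias archive are invented for illustration), the class below only pins writes for one application and leaves every other decision to the routers that follow it, or to the default behavior:

class LogsRouter(object):
    """Send writes for the hypothetical 'logs' app to the 'archive' alias."""
    def db_for_write(self, model, **hints):
        if model._meta.app_label == 'logs':
            return 'archive'
        return None
    # db_for_read, allow_relation and allow_syncdb are not defined here, so
    # Django simply skips this router when performing those checks.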

Hints

The hints received by the database router can be used to decide which database should receive a given request.

At present, the only hint that will be provided is instance, an object instance that is related to the read or write operation that is underway. This might be the instance that is being saved, or it might be an instance that is being added in a many-to-many relation. In some cases, no instance hint will be provided at all. A router can check for the existence of an instance hint and determine whether that hint should be used to alter routing behavior.
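As a purely illustrative sketch (not a recommended policy), a db_for_write() implementation could keep a write on whatever database the hinted instance was originally loaded from:

def db_for_write(self, model, **hints):
    instance = hints.get('instance')
    if instance is not None and instance._state.db:
        # Stay on the database the instance came from.
        return instance._state.db
    return None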

Using routers

Database routers are installed using the DATABASE_ROUTERS setting. This setting defines a list of class names, each specifying a router that should be used by the master router (django.db.router).

The master router is used by Django’s database operations to allocate database usage. Whenever a query needs to know which database to use, it calls the master router, providing a model and a hint (if available). Django then tries each router in turn until a database suggestion can be found. If no suggestion can be found, it tries the current _state.db of the hint instance. If a hint instance wasn’t provided, or the instance doesn’t currently have database state, the master router will allocate the default database.

An example

Example purposes only!

This example is intended as a demonstration of how the router infrastructure can be used to alter database usage. It intentionally ignores some complex issues in order to demonstrate how routers are used.

This example won’t work if any of the models in myapp contain relationships to models outside of the other database. Cross-database relationships introduce referential integrity problems that Django can’t currently handle.

The master/slave configuration described is also flawed – it doesn’t provide any solution for handling replication lag (i.e., query inconsistencies introduced because of the time taken for a write to propagate to the slaves). It also doesn’t consider the interaction of transactions with the database utilization strategy.

So - what does this mean in practice? Let’s consider another sample configuration. This one will have several databases: one for the auth application, and all other apps using a master/slave setup with two read slaves. Here are the settings specifying these databases:

DATABASES = {
    'auth_db': {
        'NAME': 'auth_db',
        'ENGINE': 'django.db.backends.mysql',
        'USER': 'mysql_user',
        'PASSWORD': 'swordfish',
    },
    'master': {
        'NAME': 'master',
        'ENGINE': 'django.db.backends.mysql',
        'USER': 'mysql_user',
        'PASSWORD': 'spam',
    },
    'slave1': {
        'NAME': 'slave1',
        'ENGINE': 'django.db.backends.mysql',
        'USER': 'mysql_user',
        'PASSWORD': 'eggs',
    },
    'slave2': {
        'NAME': 'slave2',
        'ENGINE': 'django.db.backends.mysql',
        'USER': 'mysql_user',
        'PASSWORD': 'bacon',
    },
}

Now we’ll need to handle routing. First we want a router that knows to send queries for the auth app to auth_db:

class AuthRouter(object):
    """
    A router to control all database operations on models in the
    auth application.
    """
    def db_for_read(self, model, **hints):
        """
        Attempts to read auth models go to auth_db.
        """
        if model._meta.app_label == 'auth':
            return 'auth_db'
        return None

    def db_for_write(self, model, **hints):
        """
        Attempts to write auth models go to auth_db.
        """
        if model._meta.app_label == 'auth':
            return 'auth_db'
        return None

    def allow_relation(self, obj1, obj2, **hints):
        """
        Allow relations if a model in the auth app is involved.
        """
        if obj1._meta.app_label == 'auth' or \
           obj2._meta.app_label == 'auth':
            return True
        return None

    def allow_syncdb(self, db, model):
        """
        Make sure the auth app only appears in the 'auth_db'
        database.
        """
        if db == 'auth_db':
            return model._meta.app_label == 'auth'
        elif model._meta.app_label == 'auth':
            return False
        return None

And we also want a router that sends all other apps to the master/slave configuration, and randomly chooses a slave to read from:

import random

class MasterSlaveRouter(object):
    def db_for_read(self, model, **hints):
        """
        Reads go to a randomly-chosen slave.
        """
        return random.choice(['slave1', 'slave2'])

    def db_for_write(self, model, **hints):
        """
        Writes always go to master.
        """
        return 'master'

    def allow_relation(self, obj1, obj2, **hints):
        """
        Relations between objects are allowed if both objects are
        in the master/slave pool.
        """
        db_list = ('master', 'slave1', 'slave2')
        if obj1._state.db in db_list and obj2._state.db in db_list:
            return True
        return None

    def allow_syncdb(self, db, model):
        """
        All non-auth models end up in this pool.
        """
        return True

Finally, in the settings file, we add the following (substituting path.to. with the actual python path to the module(s) where the routers are defined):

DATABASE_ROUTERS = ['path.to.AuthRouter', 'path.to.MasterSlaveRouter']

The order in which routers are processed is significant. Routers will be queried in the order they are listed in the DATABASE_ROUTERS setting. In this example, the AuthRouter is processed before the MasterSlaveRouter, and as a result, decisions concerning the models in auth are processed before any other decision is made. If the DATABASE_ROUTERS setting listed the two routers in the other order, MasterSlaveRouter.allow_syncdb() would be processed first. The catch-all nature of the MasterSlaveRouter implementation would mean that all models would be available on all databases.

With this setup installed, let's run some Django code:

>>> # This retrieval will be performed on the 'auth_db' database
>>> fred = User.objects.get(username='fred')
>>> fred.first_name = 'Frederick'

>>> # This save will also be directed to 'auth_db'
>>> fred.save()

>>> # This retrieval will be randomly allocated to a slave database
>>> dna = Person.objects.get(name='Douglas Adams')

>>> # A new object has no database allocation when created
>>> mh = Book(title='Mostly Harmless')

>>> # This assignment will consult the router, and set mh onto
>>> # the same database as the author object
>>> mh.author = dna

>>> # This save will force the 'mh' instance onto the master database...
>>> mh.save()

>>> # ... but if we re-retrieve the object, it will come back on a slave
>>> mh = Book.objects.get(title='Mostly Harmless')

Manually selecting a database

Django also provides an API that allows you to maintain complete control over database usage in your code. A manually specified database allocation will take priority over a database allocated by a router.

Manually selecting a database for a QuerySet

You can select the database for a QuerySet at any point in the QuerySet “chain.” Just call using() on the QuerySet to get another QuerySet that uses the specified database.

using() takes a single argument: the alias of the database on which you want to run the query. For example:

>>> # This will run on the 'default' database.
>>> Author.objects.all()

>>> # So will this.
>>> Author.objects.using('default').all()

>>> # This will run on the 'other' database.
>>> Author.objects.using('other').all()

Selecting a database for save()

Use the using keyword to Model.save() to specify to which database the data should be saved.

For example, to save an object to the legacy_users database, you’d use this:

>>> my_object.save(using='legacy_users')

If you don’t specify using, the save() method will save into the default database allocated by the routers.

Moving an object from one database to another

If you’ve saved an instance to one database, it might be tempting to use save(using=...) as a way to migrate the instance to a new database. However, if you don’t take appropriate steps, this could have some unexpected consequences.

Consider the following example:

>>> p = Person(name='Fred')
>>> p.save(using='first')  # (statement 1)
>>> p.save(using='second') # (statement 2)

In statement 1, a new Person object is saved to the first database. At this time, p doesn’t have a primary key, so Django issues a SQL INSERT statement. This creates a primary key, and Django assigns that primary key to p.

When the save occurs in statement 2, p already has a primary key value, and Django will attempt to use that primary key on the new database. If the primary key value isn’t in use in the second database, then you won’t have any problems – the object will be copied to the new database.

However, if the primary key of p is already in use on the second database, the existing object in the second database will be overridden when p is saved.

You can avoid this in two ways. First, you can clear the primary key of the instance. If an object has no primary key, Django will treat it as a new object, avoiding any loss of data on the second database:

>>> p = Person(name='Fred')
>>> p.save(using='first')
>>> p.pk = None # Clear the primary key.
>>> p.save(using='second') # Write a completely new object.

The second option is to use the force_insert option to save() to ensure that Django does a SQL INSERT:

>>> p = Person(name='Fred')
>>> p.save(using='first')
>>> p.save(using='second', force_insert=True)

This will ensure that the person named Fred will have the same primary key on both databases. If that primary key is already in use when you try to save onto the second database, an error will be raised.

Selecting a database to delete from

By default, a call to delete an existing object will be executed on the same database that was used to retrieve the object in the first place:

>>> u = User.objects.using('legacy_users').get(username='fred')
>>> u.delete() # will delete from the `legacy_users` database

To specify the database from which a model will be deleted, pass a using keyword argument to the Model.delete() method. This argument works just like the using keyword argument to save().

For example, if you’re migrating a user from the legacy_users database to the new_users database, you might use these commands:

>>> user_obj.save(using='new_users')
>>> user_obj.delete(using='legacy_users')

Using managers with multiple databases

Use the db_manager() method on managers to give managers access to a non-default database.

For example, say you have a custom manager method that touches the database – User.objects.create_user(). Because create_user() is a manager method, not a QuerySet method, you can’t do User.objects.using('new_users').create_user(). (The create_user() method is only available on User.objects, the manager, not on QuerySet objects derived from the manager.) The solution is to use db_manager(), like this:

User.objects.db_manager('new_users').create_user(...)

db_manager() returns a copy of the manager bound to the database you specify.
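The returned manager can also be stored and reused; for example (the user details below are made up):

>>> new_users = User.objects.db_manager('new_users')
>>> new_users.create_user('alice', 'alice@example.com', 's3kr1t')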

Using get_query_set() with multiple databases

If you’re overriding get_query_set() on your manager, be sure to either call the method on the parent (using super()) or do the appropriate handling of the _db attribute on the manager (a string containing the name of the database to use).

For example, if you want to return a custom QuerySet class from the get_query_set method, you could do this:

class MyManager(models.Manager):
    def get_query_set(self):
        qs = CustomQuerySet(self.model)
        if self._db is not None:
            qs = qs.using(self._db)
        return qs

Exposing multiple databases in Django’s admin interface

Django’s admin doesn’t have any explicit support for multiple databases. If you want to provide an admin interface for a model on a database other than that specified by your router chain, you’ll need to write custom ModelAdmin classes that will direct the admin to use a specific database for content.

ModelAdmin objects have five methods that require customization for multiple-database support:

class MultiDBModelAdmin(admin.ModelAdmin):
    # A handy constant for the name of the alternate database.
    using = 'other'

    def save_model(self, request, obj, form, change):
        # Tell Django to save objects to the 'other' database.
        obj.save(using=self.using)

    def delete_model(self, request, obj):
        # Tell Django to delete objects from the 'other' database
        obj.delete(using=self.using)

    def queryset(self, request):
        # Tell Django to look for objects on the 'other' database.
        return super(MultiDBModelAdmin, self).queryset(request).using(self.using)

    def formfield_for_foreignkey(self, db_field, request=None, **kwargs):
        # Tell Django to populate ForeignKey widgets using a query
        # on the 'other' database.
        return super(MultiDBModelAdmin, self).formfield_for_foreignkey(db_field, request=request, using=self.using, **kwargs)

    def formfield_for_manytomany(self, db_field, request=None, **kwargs):
        # Tell Django to populate ManyToMany widgets using a query
        # on the 'other' database.
        return super(MultiDBModelAdmin, self).formfield_for_manytomany(db_field, request=request, using=self.using, **kwargs)

The implementation provided here implements a multi-database strategy where all objects of a given type are stored on a specific database (e.g., all User objects are in the other database). If your usage of multiple databases is more complex, your ModelAdmin will need to reflect that strategy.

Inlines can be handled in a similar fashion. They require three customized methods:

class MultiDBTabularInline(admin.TabularInline):
    using = 'other'

    def queryset(self, request):
        # Tell Django to look for inline objects on the 'other' database.
        return super(MultiDBTabularInline, self).queryset(request).using(self.using)

    def formfield_for_foreignkey(self, db_field, request=None, **kwargs):
        # Tell Django to populate ForeignKey widgets using a query
        # on the 'other' database.
        return super(MultiDBTabularInline, self).formfield_for_foreignkey(db_field, request=request, using=self.using, **kwargs)

    def formfield_for_manytomany(self, db_field, request=None, **kwargs):
        # Tell Django to populate ManyToMany widgets using a query
        # on the 'other' database.
        return super(MultiDBTabularInline, self).formfield_for_manytomany(db_field, request=request, using=self.using, **kwargs)

Once you’ve written your model admin definitions, they can be registered with any Admin instance:

from django.contrib import admin

# Specialize the multi-db admin objects for use with specific models.
class BookInline(MultiDBTabularInline):
    model = Book

class PublisherAdmin(MultiDBModelAdmin):
    inlines = [BookInline]

admin.site.register(Author, MultiDBModelAdmin)
admin.site.register(Publisher, PublisherAdmin)

othersite = admin.AdminSite('othersite')
othersite.register(Publisher, MultiDBModelAdmin)

This example sets up two admin sites. On the first site, the Author and Publisher objects are exposed; Publisher objects have a tabular inline showing books published by that publisher. The second site exposes just publishers, without the inlines.

Using raw cursors with multiple databases

If you are using more than one database you can use django.db.connections to obtain the connection (and cursor) for a specific database. django.db.connections is a dictionary-like object that allows you to retrieve a specific connection using its alias:

from django.db import connections
cursor = connections['my_db_alias'].cursor()
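The resulting cursor is an ordinary DB-API cursor, so the usual execute/fetch calls apply; for example (the SQL and table name are illustrative):

cursor.execute("SELECT COUNT(*) FROM auth_user")
row = cursor.fetchone()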

Limitations of multiple databases

Cross-database relations

Django doesn’t currently provide any support for foreign key or many-to-many relationships spanning multiple databases. If you have used a router to partition models to different databases, any foreign key and many-to-many relationships defined by those models must be internal to a single database.

This is because of referential integrity. In order to maintain a relationship between two objects, Django needs to know that the primary key of the related object is valid. If the primary key is stored on a separate database, it’s not possible to easily evaluate the validity of a primary key.

If you’re using Postgres, Oracle, or MySQL with InnoDB, this is enforced at the database integrity level – database level key constraints prevent the creation of relations that can’t be validated.

However, if you’re using SQLite or MySQL with MyISAM tables, there is no enforced referential integrity; as a result, you may be able to ‘fake’ cross database foreign keys. However, this configuration is not officially supported by Django.

Behavior of contrib apps

Several contrib apps include models, and some apps depend on others. Since cross-database relationships are impossible, this creates some restrictions on how you can split these models across databases:

  • each one of contenttypes.ContentType, sessions.Session and sites.Site can be stored in any database, given a suitable router.
  • auth models — User, Group and Permission — are linked together and linked to ContentType, so they must be stored in the same database as ContentType.
  • admin and comments depend on auth, so their models must be in the same database as auth.
  • flatpages and redirects depend on sites, so their models must be in the same database as sites.

In addition, some objects are automatically created just after syncdb creates a table to hold them in a database:

  • a default Site,
  • a ContentType for each model (including those not stored in that database),
  • three Permissions for each model (including those not stored in that database).

For common setups with multiple databases, it isn’t useful to have these objects in more than one database. Common setups include master / slave and connecting to external databases. Therefore, it’s recommended:

  • either to run syncdb only for the default database;
  • or to write a database router that allows synchronizing these three models only onto one database (a sketch is given below).
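A sketch of the second option follows; the router name is made up, and it assumes you want these shared models only on the default database:

class SharedAppsRouter(object):
    """Synchronize contenttypes, sessions and sites only to 'default'."""
    shared_apps = ('contenttypes', 'sessions', 'sites')

    def allow_syncdb(self, db, model):
        if model._meta.app_label in self.shared_apps:
            return db == 'default'
        return None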

Warning

If you're synchronizing content types to more than one database, be aware that their primary keys may not match across databases. This may result in data corruption or data loss.