Unfortunately you can't depend on makemigrations to generate the correct SQL to migrate and cast data from a scalar field to a PostgreSQL ARRAY. But Django provides a nifty RunSQL migration operation, which is also described in the post "Down and Dirty - 9/25/2013" by Aeracode, the original creator of South, the predecessor of Django migrations. That post even mentions using RunSQL to alter a column with a CAST.
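As a rough sketch of that pattern (the table name, column name, target type, and dependency below are placeholders, not taken from that post), a RunSQL operation that casts an existing column might look like this:

    # Hypothetical migration illustrating RunSQL with an explicit CAST;
    # names and the dependency are placeholders.
    from django.db import migrations


    class Migration(migrations.Migration):

        dependencies = [
            ('my_app', '0001_initial'),
        ]

        operations = [
            migrations.RunSQL(
                'ALTER TABLE my_app_mymodel '
                'ALTER COLUMN "my_field" TYPE integer '
                'USING CAST("my_field" AS integer);'
            ),
        ]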
The issue, and the trick to migrating a column to an ArrayField, is given by PostgreSQL in the traceback, which says:

    column "my_field" cannot be cast automatically to type double precision[]
    HINT:  Specify a USING expression to perform the conversion.

Further hints can be found by RTFM and searching the internet, such as this Stack Overflow Q&A. My procedure was to use makemigrations to get the state_operations and then wrap each one in a RunSQL migration operation.
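For reference, what makemigrations generates for the model change is only a bare AlterField (sketched below with the same placeholder names). Applying it directly is what raises the error above, since the generated ALTER COLUMN ... TYPE statement carries no USING clause, which is exactly what the HINT asks for. That AlterField becomes the state_operations, while the raw SQL does the real work, giving the final migration below.

    # Sketch of the operations list that makemigrations produces on its own;
    # running this unmodified triggers the "cannot be cast automatically" error.
    operations = [
        migrations.AlterField(
            model_name='mymodel',
            name='my_field',
            field=django.contrib.postgres.fields.ArrayField(
                base_field=models.FloatField(),
                default=list,
                verbose_name=b'my field',
                size=None,
            ),
        ),
    ]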
    # -*- coding: utf-8 -*-
    from __future__ import unicode_literals

    from django.db import migrations, models
    import datetime
    from django.utils.timezone import utc
    import django.contrib.postgres.fields
    import simengapi_app.models
    import django.core.validators


    class Migration(migrations.Migration):

        dependencies = [
            ('my_app', '0XYZ_auto_YYYYMMDD_hhmm'),
        ]

        operations = [
            migrations.RunSQL(
                # Cast existing scalar values into one-element arrays via USING.
                """
                ALTER TABLE my_app_mymodel
                ALTER COLUMN "my_field" TYPE double precision[]
                USING array["my_field"]::double precision[];
                """,
                # state_operations tells Django's migration state what the
                # field looks like after the raw SQL has run.
                state_operations=[
                    migrations.AlterField(
                        model_name='mymodel',
                        name='my_field',
                        field=django.contrib.postgres.fields.ArrayField(
                            base_field=models.FloatField(),
                            default=list,
                            verbose_name=b'my field',
                            size=None,
                        ),
                    ),
                ],
            ),
        ]
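If you also want this migration to be reversible, RunSQL takes a reverse_sql argument. Here is a hedged sketch of the reverse cast; it assumes every array holds exactly one element (which is what the forward USING array[...] produces), so any extra elements would be dropped on reversal:

    # Drop-in replacement for the RunSQL operation above, adding reverse_sql.
    migrations.RunSQL(
        sql="""
        ALTER TABLE my_app_mymodel
        ALTER COLUMN "my_field" TYPE double precision[]
        USING array["my_field"]::double precision[];
        """,
        reverse_sql="""
        ALTER TABLE my_app_mymodel
        ALTER COLUMN "my_field" TYPE double precision
        USING "my_field"[1];  -- keep the first (only) element
        """,
        state_operations=[
            # Same AlterField as in the migration above.
            migrations.AlterField(
                model_name='mymodel',
                name='my_field',
                field=django.contrib.postgres.fields.ArrayField(
                    base_field=models.FloatField(),
                    default=list,
                    verbose_name=b'my field',
                    size=None,
                ),
            ),
        ],
    )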