fix serialize_dag schema migration data column type #13269
houqp wants to merge 1 commit into apache:master
Conversation
Shouldn't we add another migration for that? I think the genie is out of the bottle now, and in order to fix it for someone who has already run the migration, we have to add a new migration.
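A follow-up migration would essentially widen the column on MySQL. Here is a rough sketch of the effective DDL using plain SQLAlchemy (in Airflow this would live in an Alembic migration; the function name and the choice of LONGTEXT are illustrative assumptions, not the PR's actual code):

```python
import sqlalchemy as sa

# Hypothetical fix-up DDL, MySQL only; Postgres/SQLite TEXT has no 64 KB cap.
FIX_COLUMN_SQL = {
    "mysql": "ALTER TABLE serialized_dag MODIFY COLUMN data LONGTEXT",
}

def fix_serialized_dag_column(engine):
    """Widen serialized_dag.data on MySQL; a no-op on other databases."""
    sql = FIX_COLUMN_SQL.get(engine.dialect.name)
    if sql:
        with engine.begin() as conn:
            conn.execute(sa.text(sql))
```

Running the already-applied migration again would not help here, which is why a brand-new migration revision is needed for users who have upgraded.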
The Workflow run is cancelling this PR. It has some failed jobs matching ^Pylint$,^Static checks,^Build docs$,^Spell check docs$,^Backport packages$,^Provider packages,^Checks: Helm tests$,^Test OpenAPI*.
Yup, we will need to add another migration for this. Should we make the column type consistent with f66a46d#diff-fd7ce54d2146064dd83398f84ab648a0ffad06e941a76c53c82901257204f779?
    conn.execute("SELECT JSON_VALID(1)").fetchone()
    except (sa.exc.OperationalError, sa.exc.ProgrammingError):
-        json_type = sa.Text
+        json_type = sa.LargeBinary
Why not LargeText? (I.e., should this column be binary or text?)
After thinking more about this, we should just drop this change, because in Airflow 2.x all these DAGs will be stored with the native JSON type in MySQL (>5.6) or Postgres. So perhaps what we should do instead is add a check to airflow-upgrade-check to verify that the existing column has been converted to JSON?
What would have converted that type to JSON? (Users shouldn't ever have to manually edit the metadata db; that's what migrations are for.)
@houqp Should we close this PR then?
This pull request has been automatically marked as stale because it has not had recent activity. It will be closed in 5 days if no further activity occurs. Thank you for your contributions.
The TEXT column in MySQL is too small for large DAGs, resulting in invalid JSON blobs being stored.
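To see why truncation produces an invalid blob: MySQL's TEXT type holds at most 65,535 bytes, and with non-strict sql_mode settings an oversized insert is silently cut off at that limit (the numbers below are a demonstration of the failure shape, not Airflow's code):

```python
import json

# A serialized DAG comfortably larger than MySQL's 65,535-byte TEXT limit.
payload = json.dumps({"tasks": [{"task_id": f"task_{i}"} for i in range(5000)]})
assert len(payload) > 65535

# Simulate MySQL's silent truncation on insert under non-strict sql_mode.
truncated = payload[:65535]

# Reading the row back then fails to parse, i.e. an "invalid json blob".
try:
    json.loads(truncated)
except json.JSONDecodeError:
    print("invalid JSON blob")
```

Widening the column (or using a binary/JSON type) removes the cap, which is what this PR and the follow-up migration discussion are about.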