sqlite3 raised an error after running an Airflow command

When I ran the command airflow list_users, it raised the error below:

sqlite3.OperationalError: no such table: ab_permission_view_role

...

sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) no such table: 
ab_permission_view_role [SQL: 'SELECT ab_permission_view_role.id AS 
ab_permission_view_role_id, ab_permission_view_role.permission_view_id AS ab_permission_view_role_permission_view_id, ab_permission_view_role.role_id AS 
ab_permission_view_role_role_id \nFROM ab_permission_view_role JOIN 
ab_permission_view ON ab_permission_view.id = 
ab_permission_view_role.permission_view_id JOIN ab_view_menu ON ab_view_menu.id = ab_permission_view.view_menu_id \nWHERE ab_permission_view_role.role_id = ? 
AND ab_permission_view.view_menu_id != ?'] [parameters: (4, 51)] (Background on 
this error at: http://sqlalche.me/e/e3q8)

The same error is raised after running: airflow create_user



Solution 1:[1]

[Airflow v1] This happened because the ab_* tables were not created by airflow initdb. These tables are all for role-based access control (RBAC).

To create these tables, edit airflow.cfg:

[webserver]
rbac = True

then run airflow initdb to create the missing tables.
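
If you prefer not to edit the file, Airflow can also read this setting from an environment variable of the form AIRFLOW__{SECTION}__{KEY}; a minimal sketch of the same fix, assuming the default AIRFLOW_HOME:

# Enable RBAC via environment variable instead of editing airflow.cfg
$ export AIRFLOW__WEBSERVER__RBAC=True
$ airflow initdb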

Solution 2:[2]

Building on Newton Jose's answer: after editing the cfg file, start the webserver using

airflow webserver

Then open another terminal, switch to your working directory and run

airflow initdb

You can now start your scheduler

airflow scheduler

The bottom line is that your webserver should be running when you run the database initialization command. At least, that is what worked for me.
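
Putting Solutions 1 and 2 together, the sequence looks like this (a sketch assuming the default port 8080 and the same AIRFLOW_HOME in both terminals):

# Terminal 1: start the webserver and leave it running
$ airflow webserver -p 8080

# Terminal 2: initialize the database while the webserver is up,
# then start the scheduler
$ airflow initdb
$ airflow scheduler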

Solution 3:[3]

You need to perform initialization after installation:

$ export AIRFLOW_HOME=~/airflow
$ airflow initdb

If AIRFLOW_HOME is unset, ~/airflow/ will be created and used. This is where the config and logs are stored; if you want to reset the configuration, remove the directory pointed to by AIRFLOW_HOME and rerun airflow initdb.
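
To confirm that initialization actually created the RBAC tables, you can inspect the metadata database directly; a quick check, assuming the default SQLite backend at ~/airflow/airflow.db:

# ab_permission_view_role should appear in the output once initdb has run
$ sqlite3 ~/airflow/airflow.db ".tables"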

Now other commands should work, e.g.

$ airflow version
[2019-08-15 22:39:34,673] {__init__.py:51} INFO - Using executor SequentialExecutor
  ____________       _____________
 ____    |__( )_________  __/__  /________      __
____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
 _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/  v1.10.4

Source: the Installation section of the Airflow docs.

Solution 4:[4]

I had a similar error even after I had initialized the db: I could open a shell on my webserver, run airflow db shell, and see that the tables actually existed. In my case it was because I was launching Airflow from docker-compose and had just added an environment variable to my docker-compose.yml:

    <<: *airflow-common
    command: celery worker
    environment:
      MY_ENV_VAR: 100

By specifying my own environment: block, I overrode the entire environment mapping inherited from my airflow-common alias, since YAML merge keys do not deep-merge nested mappings. That dropped variables including

    environment:
      AIRFLOW__CORE__EXECUTOR: CeleryExecutor

This caused my Airflow workers to restart as soon as they tried the celery worker command, and that was the ultimate source of the "no such table" errors. I resolved it by defining a separate anchor just for the shared environment variables and merging it back in:

  # In the shared airflow-common block, anchor the environment mapping:
  environment:
    &airflow-common-env
    AIRFLOW__CORE__EXECUTOR: CeleryExecutor
  ...
    # In each service that adds variables, merge the shared set back in:
    <<: *airflow-common
    environment:
      <<: *airflow-common-env
      MY_ENV_VAR: 100
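
For context, here is a minimal self-contained sketch of the whole pattern; the service name, image tag, and compose version are illustrative assumptions, not taken from the original compose file:

    # docker-compose.yml (illustrative; image tag is an assumption)
    version: "3"

    x-airflow-common: &airflow-common
      image: apache/airflow:2.2.0
      environment: &airflow-common-env
        AIRFLOW__CORE__EXECUTOR: CeleryExecutor

    services:
      airflow-worker:
        <<: *airflow-common
        command: celery worker
        environment:
          # Merge the shared variables back in, then add service-specific ones
          <<: *airflow-common-env
          MY_ENV_VAR: 100

The key point is that the YAML merge key (<<) replaces, rather than deep-merges, nested mappings, so the environment mapping needs its own merge.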

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1 Newton Jose
Solution 2 Xceptions
Solution 3 Abhishek Shingadiya
Solution 4 Noumenon