R ODBC DBI error: database does not exist
I'm working on a Mac and trying to connect to my remote Postgres database in RStudio.
Error
First, the error: Error: nanodbc/nanodbc.cpp:1021: 00000: FATAL: database "db_name" does not exist
Steps taken
I first installed the necessary dependencies using homebrew:
brew install unixodbc
brew install psqlodbc
Then set up my .ini files to contain the following:
odbcinst.ini:
[PostgreSQL Driver]
Driver = /usr/local/lib/psqlodbcw.so
odbc.ini:
[PostgreSQL]
Driver = PostgreSQL Driver
Database = test_db
Servername = localhost
UserName = postgres
Password = password
Port = 5432
In R, running the following line, to check if my driver is installed correctly, produces the expected result:
> odbc::odbcListDrivers()
name attribute value
1 PostgreSQL Driver Driver /usr/local/lib/psqlodbcw.so
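For reference, the analogous check for configured data source names (rather than drivers) would be something like this; it should list the [PostgreSQL] section from odbc.ini:
# List the data source names (DSNs) that unixODBC exposes
odbc::odbcListDataSources()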
I then try to connect to the database (actual names/host/pw changed), which produces the error that the database "does not exist". All the parameters are correct, so I'm not sure what's going on. Could it be some kind of firewall/SSH protection I've set on the server side? Any help would be appreciated; I am but a humble biologist who is a bit out of my depth here.
> con <- dbConnect(odbc::odbc(),
+ driver = "PostgreSQL Driver",
+ database = "db_name",
+ uid = "db_user",
+ pwd = "db_pass",
+ host = "123.456.78.910",
+ port = 5432)
Error: nanodbc/nanodbc.cpp:1021: 00000: FATAL: database "db_name" does not exist
Solution 1:[1]
A much later answer to my own question: after shelving this project for a few months, I came back to it and tried a different driver approach using the RPostgres package. Specifically, rather than messing around with the odbc driver and .ini files, I instead tried RPostgres::Postgres().
drv <- RPostgres::Postgres()
db <- DBI::dbConnect(drv = drv,
user = "user",
password = "pwd",
dbname = "dbname",
host = "123.456.78.910")
DBI::dbGetQuery(db, "select * from my_table")
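As a quick sanity check you can also list the tables the connection can see, and close it when you're done; a minimal sketch using standard DBI calls on the db object above:
DBI::dbListTables(db)   # should include my_table if the connection points at the right database
DBI::dbDisconnect(db)   # close the connection when finished
The nice part of this route is that RPostgres talks to Postgres directly through libpq, so the whole unixODBC/psqlodbc layer and its .ini files are bypassed.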
Solution 2:[2]
First, it seems your pg database is local on your Mac (localhost, same for me), so you don't have to specify your IP in R.
What happens if you type this? Because your DSN for db_test / db_name is in reality [PostgreSQL] (in .odbc.ini).
library(DBI)
con_odbc_to_db_name <- dbConnect(odbc::odbc(), "PostgreSQL")
Or, if that doesn't work, maybe test the same with "db_name", "db_test" / "test_db". But I think your problem is just that "PostgreSQL" name in odbc.ini.
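If you are not sure which DSN name is the right one, a small sketch like this can test each candidate without stopping on an error (the candidate names are just guesses based on your files):
library(DBI)
# Candidate DSN names, i.e. section headers from (.)odbc.ini
candidates <- c("PostgreSQL", "db_name", "db_test", "test_db")
for (dsn in candidates) {
  ok <- dbCanConnect(odbc::odbc(), dsn)  # TRUE if a connection can be opened
  message(dsn, ": ", ok)
}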
Here is a more complete answer on this, which also serves as a sort of memo pad.
PostgreSQL odbc and DSN definitions on a Mac.
For odbc, there are two files:
- odbcinst.ini, in /usr/local/etc/, to specify the driver name (you choose it)
- .odbc.ini, in your $HOME (~), to be created the first time, to specify the data source names (odbc DSNs)
nano /usr/local/etc/odbcinst.ini
You have to correctly indicate the ODBC driver name here (and reuse it afterwards in .odbc.ini).
For a complete overview of this setup on a Mac, there is a very good blog post on these odbc/odbcinst files, see here.
You will see that the driver name in brackets in the odbcinst.ini file has to be coherent with the Driver = ... specified in .odbc.ini for your DSN.
# nano /usr/local/etc/odbcinst.ini
[PostgreSQL Unicode]
Description = PostgreSQL ODBC driver (Unicode version)
Driver = psqlodbcw.so
Debug = 0
CommLog = 1
UsageCount = 1
nano ~/.odbc.ini
With the DBI/odbc packages you can put the connection parameters inside .odbc.ini for each database, for instance as below. Note that you have to put your db name inside [...] (the ODBC Data Source Name) at the beginning of each definition.
# .odbc.ini contains:
[db_name]
Driver = PostgreSQL Unicode
ServerName = localhost
Port = 5432
Database = db_name
Username = usr
Password = pwd
Protocol = 13.0.3
Debug = 1
[db_test]
Driver = PostgreSQL Unicode
ServerName = localhost
Port = 5432
Database = db_test
Username = usr
Password = pwd
Protocol = 13.0.3
Debug = 1
Test connection in RStudio
Your databases will then appear in the RStudio Connections pane; for me the database is called "pmsi" instead of "db_name".
After that, connecting in R is easier: you just have to click on your db_name row in that pane, or, more precisely, type this:
library(DBI)
con_odbc_to_db_name <- dbConnect(odbc::odbc(), "db_name")
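To make sure you really landed on the intended database (and not a default one), it can be worth asking the server directly; a minimal sketch, assuming the connection above succeeded:
# Ask the server which database this session is actually using
DBI::dbGetQuery(con_odbc_to_db_name, "SELECT current_database();")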
And about the RPostgres package
However, I've personally found that writing big tables (many rows, many cols) to pg with odbc/DBI dbWriteTable is a bit slow. For this I use the RPostgres package, which uses \COPY to "bulk" insert raw data:
con_to_db_name <- RPostgres::dbConnect(RPostgres::Postgres(), "db_name")
# change the default schema after that if needed
DBI::dbSendQuery(con_to_db_name, "SET search_path = db_preferred_schema, public;")
And then use the good dbWriteTable from RPostgres:
RPostgres::dbWriteTable(con_to_db_name, "db_table_a", table_a, append = TRUE)
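As a quick check that the bulk insert worked, you can count the rows and close the connection; a minimal sketch, with db_table_a resolving through the search_path set above:
# Count the rows to confirm the append worked, then clean up
DBI::dbGetQuery(con_to_db_name, "SELECT count(*) FROM db_table_a;")
DBI::dbDisconnect(con_to_db_name)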
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
Solution | Source
---|---
Solution 1 |
Solution 2 |