geoname data fails to import, value too long
| Affects | Status | Importance | Assigned to | Milestone |
|---|---|---|---|---|
| Ubuntu Geonames | Fix Released | Undecided | Tim Kuhlman | |
Bug Description
When running the `import-geonames` script, I receive this error:
```
2015-11-17 22:49:20 INFO db-relation-changed BEGIN
2015-11-17 22:49:20 INFO db-relation-changed NOTICE: table "geoname_load" does not exist, skipping
2015-11-17 22:49:20 INFO db-relation-changed DROP TABLE
2015-11-17 22:49:20 INFO db-relation-changed CREATE TABLE
2015-11-17 22:50:06 INFO db-relation-changed ERROR: value too long for type character varying(60)
2015-11-17 22:50:06 INFO db-relation-changed CONTEXT: COPY geoname_load, line 10822213, column cc2: "AT,BE,
2015-11-17 22:50:06 INFO db-relation-changed ERROR: current transaction is aborted, commands ignored until end of transaction block
```
A possible fix is to expand the size of the cc2 column, but I am not sure whether doing so has other implications.
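A minimal sketch of that fix, assuming the staging table and column names shown in the log (`geoname_load`, `cc2`), and an assumed new size of 200, might be:

```sql
-- Widen cc2 so longer comma-separated country-code lists fit during COPY.
-- 200 is an assumed bound, not taken from the geonames schema; pick a size
-- based on the longest cc2 value in the data.
ALTER TABLE geoname_load ALTER COLUMN cc2 TYPE varchar(200);
```

The same change would need to be mirrored in the `CREATE TABLE` statement that `import-geonames.sh` runs before the COPY, since the log shows the table being recreated on each import.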
Related branches
- Evan (community): Approve
- Diff: 61 lines (+9/-9), 2 files modified:
  - geoname-modpython.py (+1/-1)
  - import-geonames.sh (+8/-8)
Changed in ubuntu-geonames:
- assignee: nobody → Tim Kuhlman (timkuhlman)

Changed in ubuntu-geonames:
- status: New → Fix Committed
- status: Fix Committed → Fix Released
Unless there is a genuine upper bound you need to enforce, use `text` instead of `varchar(xx)`. In PostgreSQL, the only internal difference between the two is the enforcement of the length limit; they are stored the same way and perform the same.
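Following that advice, a sketch of the unbounded variant (again assuming the `geoname_load`/`cc2` names from the log) would be:

```sql
-- text has no length cap, so oversized comma-separated country-code
-- lists can never abort the COPY. PostgreSQL stores varchar and text
-- identically, so there is no storage or performance penalty.
ALTER TABLE geoname_load ALTER COLUMN cc2 TYPE text;
```

This trades the length check for import robustness, which fits a bulk-load staging table where the data is trusted to be well-formed upstream.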