
pygtfs's Introduction

pygtfs


Overview

pygtfs is a library that models information stored in Google's General Transit Feed Specification (GTFS) format. GTFS is a format designed to specify information about a transit system, such as a city's subways or a private company's bus services. pygtfs stores information in an SQLite database using SQLAlchemy to facilitate the storage of Python objects in a relational database.
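For orientation, here is a minimal usage sketch based on the API that appears throughout the issues below (pygtfs.Schedule, pygtfs.append_feed, and the schedule's SQLAlchemy session); the feed filename is a placeholder.

import pygtfs
from pygtfs.gtfs_entities import Stop

# Load a GTFS feed into an in-memory SQLite database.
# "sample-feed.zip" is a placeholder for any GTFS zip file or directory.
sched = pygtfs.Schedule(":memory:")
pygtfs.append_feed(sched, "sample-feed.zip")

# The underlying SQLAlchemy session can then be queried directly.
print(sched.session.query(Stop).count())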

pygtfs is a fork of @eoghanmurray's fork of @andrewblim's gtfs-sql, which is a fork of @bmander's gtfs. See the git logs for more fun history.

License: MIT, included in license.txt.

Dependencies

  • SQLAlchemy 0.7.8. Used for all mapping of GTFS objects to the relational DB. You'll need to be familiar with it to read the code; the documentation is pretty solid.
  • pytz 2012d. A few GTFS fields are expected to be in a tz time zone format.
  • six. Used in order to support python2 and python3 in a single code base.
  • docopt. Pythonic command-line argument parser that will make you smile

Installation

Get setuptools if you don't have it, clone the repo, and use python setup.py install.
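Since the project is published on PyPI (a PyPI badge and a release request appear elsewhere on this page, and `pip3 list` output below shows pygtfs 0.1.6), a pip-based install should also work; treat the exact released version as whatever is currently on PyPI:

pip install pygtfs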

Documentation

Hosted on https://pygtfs.readthedocs.org/

TODO

  • Improve testing; add some unit testing framework and test with a variety of GTFS data feeds.
  • Add more docs

Why fork?

  • natively support several gtfs feeds per database
  • less SLOC, more DRY
  • add python3 support
  • renamed to a more generic name
  • will continue to maintain

pygtfs's People

Contributors

alex-zach, andrewblim, bertware, bmander, cmb, eoghanmurray, evyatarc, irees, jarondl, leamingrad, mazzy89, mileserickson, molisani, peeterskris, spencerrecneps, spresse1, tomkv, vingerha, wlach, youtux


pygtfs's Issues

IntegrityError

Would you mind helping me with the error below when I try to look at MTA subway data? The link to the GTFS data is https://datamine-history.s3.amazonaws.com/gtfs-2014-09-17-09-31

IntegrityError: (sqlite3.IntegrityError) UNIQUE constraint failed: shapes.feed_id, shapes.shape_id [SQL: 'INSERT INTO shapes (feed_id, shape_id, shape_pt_lat, shape_pt_lon, shape_pt_sequence, shape_dist_traveled) VALUES (?, ?, ?, ?, ?, ?)'] [parameters: ((1, '4..N06R', 40.668897, -73.932942, '0', None), (1, '4..N06R', 40.669399, -73.942161, '1', None), (1, '4..N06R', 40.669847, -73.950466, '2', None), (1, '4..N06R', 40.669922, -73.951706, '3', None), (1, '4..N06R', 40.670083, -73.954713, '4', None), (1, '4..N06R', 40.670096, -73.954983, '5', None), (1, '4..N06R', 40.670118, -73.955258, '6', None), (1, '4..N06R', 40.670157, -73.955542, '7', None) ... displaying 10 of 5001 total bound parameter sets ... (1, 'L..S01R', 40.683552, -73.905577, '260', None), (1, 'L..S01R', 40.683429, -73.905534, '261', None))]

route_type range error from Prague's feed

Hi, I've been trying to get GTFS working via the homeassistant component which uses pygtfs. I'm trying to use Prague's GTFS feed.

I'm on
Python 2.7.10 (default, Jul 14 2015, 19:46:27)
[GCC 4.2.1 Compatible Apple LLVM 6.0 (clang-600.0.39)] on darwin

It doesn't have a lot of options; I just gave it the following parameters:

  • platform: gtfs
    name: transport
    origin: U49N198
    destination: U484Z2
    data: jrdata.zip

I also tried the human-readable names for the stops, and tried to unzip jrdata and give it the folder name instead. No dice. Any ideas where the problem might be?

2018-12-03 00:13:09 ERROR (MainThread) [homeassistant.components.sensor] Error while setting up platform gtfs
Traceback (most recent call last):
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/homeassistant/helpers/entity_platform.py", line 128, in _async_setup_platform
SLOW_SETUP_MAX_WAIT, loop=hass.loop)
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/asyncio/tasks.py", line 416, in wait_for
return fut.result()
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/concurrent/futures/thread.py", line 57, in run
result = self.fn(*self.args, **self.kwargs)
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/homeassistant/components/sensor/gtfs.py", line 180, in setup_platform
pygtfs.append_feed(gtfs, os.path.join(gtfs_dir, data))
File "/Users/scstraus/.homeassistant/deps/lib/python/site-packages/pygtfs/loader.py", line 89, in append_feed
instance = gtfs_class(feed_id=feed_id, **records_as_dict)
File "", line 4, in init
File "/Users/scstraus/.homeassistant/deps/lib/python/site-packages/sqlalchemy/orm/state.py", line 424, in _initialize_instance
manager.dispatch.init_failure(self, args, kwargs)
File "/Users/scstraus/.homeassistant/deps/lib/python/site-packages/sqlalchemy/util/langhelpers.py", line 66, in exit
compat.reraise(exc_type, exc_value, exc_tb)
File "/Users/scstraus/.homeassistant/deps/lib/python/site-packages/sqlalchemy/util/compat.py", line 249, in reraise
raise value
File "/Users/scstraus/.homeassistant/deps/lib/python/site-packages/sqlalchemy/orm/state.py", line 421, in _initialize_instance
return manager.original_init(*mixed[1:], **kwargs)
File "/Users/scstraus/.homeassistant/deps/lib/python/site-packages/sqlalchemy/ext/declarative/base.py", line 748, in _declarative_constructor
setattr(self, k, kwargs[k])
File "/Users/scstraus/.homeassistant/deps/lib/python/site-packages/sqlalchemy/orm/attributes.py", line 229, in set
instance_dict(instance), value, None)
File "/Users/scstraus/.homeassistant/deps/lib/python/site-packages/sqlalchemy/orm/attributes.py", line 710, in set
value, old, initiator)
File "/Users/scstraus/.homeassistant/deps/lib/python/site-packages/sqlalchemy/orm/attributes.py", line 717, in fire_replace_event
state, value, previous, initiator or self.replace_token)
File "/Users/scstraus/.homeassistant/deps/lib/python/site-packages/sqlalchemy/orm/util.py", line 136, in set

return validator(state.obj(), key, value)
File "/Users/scstraus/.homeassistant/deps/lib/python/site-packages/pygtfs/gtfs_entities.py", line 66, in in_range
"{0} must be in range {1}, was {2}".format(key, int_choice, value))
pygtfs.exceptions.PygtfsValidationError: route_type must be in range range(0, 8), was 800

Load database from file?

Is there a way to convert the GTFS feed to a database and cache it to a file, so that every time after that I can load it instantly, rather than doing the GTFS -> database conversion (which takes over a minute) each time?
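A sketch of the usual approach, assuming Schedule accepts an SQLite file path the same way it accepts ":memory:" and database URLs elsewhere on this page: build the database once into an on-disk file, then reopen that file on later runs without calling append_feed again.

import pygtfs

# First run: build the database once (slow). "gtfs.sqlite" is a placeholder path.
sched = pygtfs.Schedule("gtfs.sqlite")
pygtfs.append_feed(sched, "feed.zip")

# Later runs: just reopen the existing file (fast), no append_feed needed.
sched = pygtfs.Schedule("gtfs.sqlite")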

Filter out unwanted agencies to speed up feed update time?

I have a GTFS file with a bunch of different agencies, but I'm only interested in the data for one of them.
It'd be very useful for me if there was a mechanism to skip insertion of routes that don't belong to that specific agency, and then, once all the routes I want are in and the rest are out, to also skip insertion of Trips, StopTimes and Shapes that belong to routes which are not in the DB.
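There is no built-in filter, but a post-import cleanup sketch along these lines might work. The entity and column names (Route.agency_id, Trip.route_id, StopTime.trip_id) are assumptions based on the tracebacks on this page and the GTFS spec, and related rows are deleted explicitly because no cascade is guaranteed.

from pygtfs.gtfs_entities import Route, Trip, StopTime

KEEP_AGENCY = "my_agency_id"   # hypothetical agency to keep; sched is an existing pygtfs.Schedule

session = sched.session
unwanted_routes = [r.route_id for r in
                   session.query(Route).filter(Route.agency_id != KEEP_AGENCY)]
unwanted_trips = [t.trip_id for t in
                  session.query(Trip).filter(Trip.route_id.in_(unwanted_routes))]

# Delete dependent rows first, then the trips and routes themselves.
session.query(StopTime).filter(StopTime.trip_id.in_(unwanted_trips)).delete(synchronize_session=False)
session.query(Trip).filter(Trip.route_id.in_(unwanted_routes)).delete(synchronize_session=False)
session.query(Route).filter(Route.agency_id != KEEP_AGENCY).delete(synchronize_session=False)
session.commit()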

Publish latest version on PyPI?

Hi @jarondl, would you mind publishing the latest available version to PyPI? It's been quite a while since the last one and there have been some significant changes since. Thanks in advance!

GTFS format extensions

Some GTFS providers seem to have started using extensions to the format. For example, Toronto's GO Transit adds one more value to the list of valid location_type values.

Extensions list is here:

https://support.google.com/transitpartners/answer/2450962?hl=en

Fix for this specific issue is:

gtfs_entities.py: line 166, change list from [None,0,1] to [None,0,1,2]

I added some additional info to the _validate_int_choice function as well, to assist with debugging:

line 60:

assert int_value in int_choice, "%s value outside limits: %r not in %s" % (key,int_value,int_choice[0:100])

Import fails when conditionally required route.agency_id is missing

When agency_id is not set on a route, the import fails. Example feed: https://www.bart.gov/sites/default/files/docs/google_transit_20211001_20220213_v1.zip

Traceback (most recent call last):

  File "/home/travis/virtualenv/python3.9.1/lib/python3.9/site-packages/sqlalchemy/engine/base.py", line 1779, in _execute_context

    self.dialect.do_executemany(

  File "/home/travis/virtualenv/python3.9.1/lib/python3.9/site-packages/sqlalchemy/dialects/postgresql/psycopg2.py", line 962, in do_executemany

    context._psycopg2_fetched_rows = xtras.execute_values(

  File "/home/travis/virtualenv/python3.9.1/lib/python3.9/site-packages/psycopg2/extras.py", line 1270, in execute_values

    cur.execute(b''.join(parts))

psycopg2.errors.ForeignKeyViolation: insert or update on table "routes" violates foreign key constraint "routes_feed_id_fkey"

DETAIL:  Key (feed_id, agency_id)=(1, None) is not present in table "agency".

From the spec:

Field: agency_id (ID referencing agency.agency_id)
Presence: Conditionally Required
Description: Agency for the specified route. Required if multiple agencies are defined in agency.txt; optional otherwise.

Suggested fix: when no agency_id is defined, automatically set it to the single agency included in agency.txt.
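A sketch of that suggestion as a pre-processing step, assuming a per-record dictionary like the records_as_dict seen in the loader tracebacks on this page; the helper and variable names here are hypothetical.

# Hypothetical fix-up applied to each route record before it is written,
# assuming exactly one agency was read from agency.txt.
def fill_missing_agency_id(record_dict, only_agency_id):
    if not record_dict.get("agency_id"):
        record_dict["agency_id"] = only_agency_id
    return record_dict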

Output uses print statements, not logging... integration challenge

I am using the library a lot, i.e. I have a growing set of end users in Home Assistant.
As the data is often not (very) good, the logs help me identify things quicker.
However, pygtfs does not emit logs but print statements, and these are not shown in the Home Assistant logs, which also means errors during extraction are not shown.
Can you add proper log output, or is this too complex?

2024-05-09 09:14:13.750 DEBUG (MainThread) [custom_components.gtfs2.config_flow] Checkdata pygtfs: extracting with data: {'extract_from': 'url', 'file': 'vv', 'url': 'https://cdn.mbta.com/MBTA_GTFS.zip'}
Loading GTFS data for <class 'pygtfs.gtfs_entities.Agency'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.Stop'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.Transfer'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.Route'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.Fare'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.FareRule'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.ShapePoint'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.Service'>:
...
   for gtfs_class in gtfs_all:
        print('Loading GTFS data for %s:' % gtfs_class)
        gtfs_filename = gtfs_class.__tablename__ + '.txt'
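A sketch of what switching to the logging module could look like in loader.py; the loop and message are taken from the snippet above, everything else (logger name, placement) is an assumption.

import logging

logger = logging.getLogger(__name__)

for gtfs_class in gtfs_all:
    # Same message as before, but routed through logging so that
    # integrators such as Home Assistant can capture it.
    logger.info("Loading GTFS data for %s", gtfs_class)
    gtfs_filename = gtfs_class.__tablename__ + '.txt'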

TypeError when 'shape_dist_traveled' is empty

When pygtfs tries to parse the following line:

shape_id,shape_pt_lat,shape_pt_lon,shape_pt_sequence,shape_dist_traveled
149554,38.742155,-9.102203,1,

I get the following error:

Failure while writing Shapes(shape_id='149554', shape_pt_lat='38.742155', shape_pt_lon='-9.102203', shape_pt_sequence='1', shape_dist_traveled=None)
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-3-44756ea8a7d9> in <module>
----> 1 pygtfs.append_feed(sched, "data")
...
~/.virtualenvs/thesis/lib/python3.7/site-packages/pygtfs/gtfs_entities.py in is_float_none(self, key, value)
     85     def is_float_none(self, key, value):
     86         try:
---> 87             return float(value)
     88         except ValueError:
     89             if value is None or value == "":

TypeError: float() argument must be a string or a number, not 'NoneType'

Looking at the is_float_none method, it seems you were expecting float(None) to raise a ValueError, but it is raising a TypeError instead.
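A sketch of the fix implied by that observation: catch TypeError alongside ValueError so a None value falls through to the empty-value branch. The surrounding method body is reconstructed from the traceback above.

def is_float_none(self, key, value):
    # Accept floats, empty strings and None; re-raise for anything else.
    try:
        return float(value)
    except (ValueError, TypeError):
        if value is None or value == "":
            return None
        raise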

Unicode issues with SNCF Transilien GTFS (Python 2.7)

gtfs2db dies with

UnicodeEncodeError: 'ascii' codec can't encode character u'\xe8' in position 16: ordinal not in range(128)

It looks like it's trying to "decode" a unicode object. I can provide more detailed info if you want to take this on. I don't want to fix it myself as I might break Python3 support.

The GTFS file I'm trying to consume is available here:
http://ressources.data.sncf.com/explore/dataset/sncf-horaires-des-lignes-transilien/?tab=metas

UNIQUE constraint failed: translations

Hi,

While importing https://stibmivb.opendatasoft.com/api/explore/v2.1/catalog/datasets/gtfs-files-production/alternative_exports/gtfszip/ I got a unique constraint error on the translations.txt import.
The publisher added the same trans_id and lang combination twice.
I found the faulty lines and could import the data, but at the next update I will get an error again if they don't correct the file.
Is there a way to ignore an existing trans_id & lang combination in the same file?
I'm afraid that SQLAlchemy doesn't support "OR IGNORE" in session.add().
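As a workaround until the feed is fixed, here is a pre-processing sketch that drops duplicate (trans_id, lang) pairs from translations.txt before importing; the column names are taken from the error below, and the file paths are placeholders.

import csv

seen = set()
with open("translations.txt", newline="", encoding="utf-8") as src, \
     open("translations.dedup.txt", "w", newline="", encoding="utf-8") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        key = (row["trans_id"], row["lang"])
        if key not in seen:          # keep only the first occurrence
            seen.add(key)
            writer.writerow(row)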

2402 records read for <class 'pygtfs.gtfs_entities.Translation'>.                                                                                                                                                                        
Traceback (most recent call last):                                                                                                                                                                                                       
  File "/home/pi/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1942, in _exec_single_context                                                                                                                       
    context,
  File "/home/pi/.local/lib/python3.7/site-packages/sqlalchemy/engine/default.py", line 919, in do_executemany
    cursor.executemany(statement, parameters)
sqlite3.IntegrityError: UNIQUE constraint failed: translations.feed_id, translations.trans_id, translations.lang
The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "sensor.py", line 321, in <module>
    init(clean)
  File "sensor.py", line 149, in init
    attributes = getGTFSAttributes()
  File "sensor.py", line 72, in getGTFSAttributes
    pygtfs = import_gtfs_files()
  File "sensor.py", line 66, in import_gtfs_files
    pygtfs.append_feed(gtfs, os.path.join(".", data))
  File "/home/pi/.local/lib/python3.7/site-packages/pygtfs/loader.py", line 121, in append_feed
    schedule.session.flush()
  File "/home/pi/.local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 4310, in flush
    self._flush(objects)
  File "/home/pi/.local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 4446, in _flush
    transaction.rollback(_capture_exception=True)
  File "/home/pi/.local/lib/python3.7/site-packages/sqlalchemy/util/langhelpers.py", line 146, in __exit__
    raise exc_value.with_traceback(exc_tb)
  File "/home/pi/.local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 4406, in _flush
    flush_context.execute()
  File "/home/pi/.local/lib/python3.7/site-packages/sqlalchemy/orm/unitofwork.py", line 466, in execute
    rec.execute(self)
  File "/home/pi/.local/lib/python3.7/site-packages/sqlalchemy/orm/unitofwork.py", line 645, in execute
    uow,
  File "/home/pi/.local/lib/python3.7/site-packages/sqlalchemy/orm/persistence.py", line 98, in save_obj
    insert,
  File "/home/pi/.local/lib/python3.7/site-packages/sqlalchemy/orm/persistence.py", line 1044, in _emit_insert_statements
    statement, multiparams, execution_options=execution_options
  File "/home/pi/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1419, in execute
    execution_options or NO_OPTIONS,
  File "/home/pi/.local/lib/python3.7/site-packages/sqlalchemy/sql/elements.py", line 517, in _execute_on_connection
    self, distilled_params, execution_options
  File "/home/pi/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1649, in _execute_clauseelement
    cache_hit=cache_hit,
  File "/home/pi/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1849, in _execute_context
    dialect, context, statement, parameters
  File "/home/pi/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1989, in _exec_single_context
    e, str_statement, effective_parameters, cursor, context
  File "/home/pi/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 2343, in _handle_dbapi_exception
    raise sqlalchemy_exception.with_traceback(exc_info[2]) from e
  File "/home/pi/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1942, in _exec_single_context
    context,
  File "/home/pi/.local/lib/python3.7/site-packages/sqlalchemy/engine/default.py", line 919, in do_executemany
    cursor.executemany(statement, parameters)
sqlalchemy.exc.IntegrityError: (sqlite3.IntegrityError) UNIQUE constraint failed: translations.feed_id, translations.trans_id, translations.lang
[SQL: INSERT INTO translations (feed_id, trans_id, lang, translation) VALUES (?, ?, ?, ?)]
[parameters: [(1, 'STIB', 'fr', 'STIB'), (1, 'STIB', 'nl', 'MIVB'), (1, 'http://www.stib-mivb.be', 'fr', 'http://www.stib-mivb.be/index.htm?l=fr'), (1, 'http://www.stib-mivb.be', 'nl', 'http://www.stib-mivb.be/index.htm?l=nl'), (1, 'ABBAYE', 'fr', 'Abbaye'), (1, 'ABBAYE', 'nl', 'Abdij'), (1, 'ACACIAS', 'fr', 'Acacias'), (1, 'ACACIAS', 'nl', 'Acacia')  ... displaying 10 of 2402 total bound parameter sets ...  (1, 'ICHEC', 'fr', 'ICHEC'), (1, 'ICHEC', 'nl', 'ICHEC')]]
(Background on this error at: https://sqlalche.me/e/20/gkpj)

Thanks

Additional maintainers

Hi @jarondl, I'm sure it has been a thankless task to maintain this project for the last 8 years (if I'm reading this right). The repo is really valuable to the community. I came across this project as a dependency of Home Assistant and am very thankful for the work you've put into it.

Seeing as the project seems to be lagging behind the GTFS standard and PRs seem to be taking ~1 month to get reviewed, I think the bus factor for this project is too low.

Are you open to adding new maintainers who can help you out?

DeclarativeMeta object got multiple values for keyword argument 'feed_id'

First time using pygtfs, so apologies in advance if I've not yet caught onto a possibly obvious solution...

Using pygtfs 0.1.5 on Ubuntu 16.04 with SQLite 3.11.0, running gtfs2db append carta-sc-us.zip feed.db fails with TypeError: DeclarativeMeta object got multiple values for keyword argument 'feed_id' while writing FeedInfo.

Strange as there's only 1 instance of feed_id and its value is 213.

Not quite sure where to go from here. Any advice is greatly appreciated.

feed_info.txt Output
➜ cat carta-sc-us/feed_info.txt
feed_publisher_url,feed_publisher_name,feed_lang,feed_version,feed_license,feed_contact_email,feed_contact_url,feed_start_date,feed_end_date,feed_id
http://www.trilliumtransit.com,"Trillium Solutions, Inc.",en,UTC: 16-May-2019 17:24,,[email protected],http://support.trilliumtransit.com,20190516,20200101,213
Console Output
➜ gtfs2db append carta-sc-us.zip feed.db
Loading GTFS data for <class 'pygtfs.gtfs_entities.Agency'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.Stop'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.Transfer'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.Route'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.Fare'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.FareRule'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.ShapePoint'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.Service'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.ServiceException'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.Trip'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.Frequency'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.StopTime'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.FeedInfo'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.Translation'>:
1 record read for <class 'pygtfs.gtfs_entities.Agency'>.
881 records read for <class 'pygtfs.gtfs_entities.Stop'>.
2 records read for <class 'pygtfs.gtfs_entities.Transfer'>.
26 records read for <class 'pygtfs.gtfs_entities.Route'>.
6 records read for <class 'pygtfs.gtfs_entities.Fare'>.
64 records read for <class 'pygtfs.gtfs_entities.FareRule'>.
....44472 records read for <class 'pygtfs.gtfs_entities.ShapePoint'>.
8 records read for <class 'pygtfs.gtfs_entities.Service'>.
53 records read for <class 'pygtfs.gtfs_entities.ServiceException'>.
2381 records read for <class 'pygtfs.gtfs_entities.Trip'>.
2381 records read for <class 'pygtfs.gtfs_entities.Frequency'>.
......60829 records read for <class 'pygtfs.gtfs_entities.StopTime'>.
Failure while writing FeedInfo(feed_publisher_url='http://www.trilliumtransit.com', feed_publisher_name='Trillium Solutions, Inc.', feed_lang='en', feed_version='UTC: 16-May-2019 17:24', feed_start_date='20190516', feed_end_date='20200101', feed_id='213')
Traceback (most recent call last):
  File "/usr/local/bin/gtfs2db", line 9, in <module>
    load_entry_point('pygtfs==0.1.6.dev22+gbbc2691', 'console_scripts', 'gtfs2db')()
  File "/usr/local/lib/python3.5/dist-packages/pygtfs-0.1.6.dev22+gbbc2691-py3.5.egg/pygtfs/gtfs2db.py", line 55, in main
    chunk_size=int(args['--chunk-size']))
  File "/usr/local/lib/python3.5/dist-packages/pygtfs-0.1.6.dev22+gbbc2691-py3.5.egg/pygtfs/loader.py", line 84, in append_feed
    instance = gtfs_class(feed_id=feed_id, **record._asdict())
TypeError: DeclarativeMeta object got multiple values for keyword argument 'feed_id'

Location type should accept type '2'

Within gtfs_entities.py this line should be updated...
_validate_location = _validate_int_choice([None, 0, 1], 'location_type')

To
_validate_location = _validate_int_choice([None, 0, 1, 2], 'location_type')

The location_type field identifies whether this stop ID represents a stop, station, or station entrance. If no location type is specified, or the location_type is blank, stop IDs are treated as stops. Stations may have different properties from stops when they are represented on a map or used in trip planning. The location type field can have the following values:

0 or blank - Stop. A location where passengers board or disembark from a transit vehicle.
1 - Station. A physical structure or area that contains one or more stops.
2 - Station Entrance/Exit. A location where passengers can enter or exit a station from the street. The stop entry must also specify a parent_station value referencing the stop ID of the parent station for the entrance.

Issue or Question or ?: large gtfs without calendar (only calendar_dates)

Hi,
I am remodelling the GTFS solution in Home Assistant and have a use case with a large file from the NL; the SQLite file grows to 7 GB.
As this dataset does not contain calendar entries, I need to rewrite the query to compensate for that, and since SQLite does not allow outer joins I need to run it twice with a UNION ALL. Due to the large amount of data the query is pretty slow (DB Browser: 20-23 sec) and I was wondering if I could optimize this with indexes. I will do that myself, but two questions:

  • can you easily add indexes to pygtfs? Otherwise I need to add them in my own solution, which I would prefer not to (see the sketch below)
  • would you maybe know of a way to construct the calendar from only calendar_dates entries? I may try myself but am asking first
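On the indexing point, nothing stops you from adding indexes to the generated SQLite file after the import. A sketch using sqlite3 from the standard library; the table and column names (stop_times, trips, feed_id, trip_id, service_id) are assumed from the SQL and tracebacks shown elsewhere on this page, and the path is a placeholder.

import sqlite3

conn = sqlite3.connect("gtfs.sqlite")
# Speed up joins from trips to stop_times and from calendar/calendar_dates to trips.
conn.execute("CREATE INDEX IF NOT EXISTS idx_stop_times_trip ON stop_times (feed_id, trip_id)")
conn.execute("CREATE INDEX IF NOT EXISTS idx_trips_service ON trips (feed_id, service_id)")
conn.commit()
conn.close()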

AssertionError: value outside limits - when loading in schedule

I am getting an error reading in the schedule.

I have successfully used this GTFS file for routing and service area creation via ArcMap, and I'd like to use your tool to add new routes to my analysis.

I'm unsure how to address the error.

ipython                   5.3.0                    py36_0
ipython_genutils          0.2.0                    py36_0

import pygtfs
sched = pygtfs.Schedule(":memory:")
pygtfs.append_feed(sched, 'full_greater_sydney_gtfs_static.zip')

The output is:

Loading GTFS data for <class 'pygtfs.gtfs_entities.Agency'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.Stop'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.Route'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.Trip'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.StopTime'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.Service'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.ServiceException'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.Fare'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.FareRule'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.ShapePoint'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.Frequency'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.Transfer'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.FeedInfo'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.Translation'>:
125 records read for <class 'pygtfs.gtfs_entities.Agency'>.
.......38991 records read for <class 'pygtfs.gtfs_entities.Stop'>.
Failure while writing Routes(route_id='1-11T-6-sj2-2', agency_id='700', route_short_name='11T6', route_long_name='Carlingford, then all stations to Clyde', route_desc='Temporary buses', route_type='714', route_color='00B5EF', route_text_color='FFFFFF')
---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
<ipython-input-9-5ce656f03730> in <module>()
----> 1 pygtfs.append_feed(sched, 'full_greater_sydney_gtfs_static.zip')

C:\ProgramData\Anaconda3\lib\site-packages\pygtfs\loader.py in append_feed(schedule, feed_filename, strip_fields, chunk_size, agency_id_override)
     87 
     88             try:
---> 89                 instance = gtfs_class(feed_id=feed_id, **records_as_dict)
     90             except:
     91                 print("Failure while writing {0}".format(record))

<string> in __init__(self, **kwargs)

C:\ProgramData\Anaconda3\lib\site-packages\sqlalchemy\orm\state.py in _initialize_instance(*mixed, **kwargs)
    412         except:
    413             with util.safe_reraise():
--> 414                 manager.dispatch.init_failure(self, args, kwargs)
    415 
    416     def get_history(self, key, passive):

C:\ProgramData\Anaconda3\lib\site-packages\sqlalchemy\util\langhelpers.py in __exit__(self, type_, value, traceback)
     64             self._exc_info = None   # remove potential circular references
     65             if not self.warn_only:
---> 66                 compat.reraise(exc_type, exc_value, exc_tb)
     67         else:
     68             if not compat.py3k and self._exc_info and self._exc_info[1]:

C:\ProgramData\Anaconda3\lib\site-packages\sqlalchemy\util\compat.py in reraise(tp, value, tb, cause)
    185         if value.__traceback__ is not tb:
    186             raise value.with_traceback(tb)
--> 187         raise value
    188 
    189 else:

C:\ProgramData\Anaconda3\lib\site-packages\sqlalchemy\orm\state.py in _initialize_instance(*mixed, **kwargs)
    409 
    410         try:
--> 411             return manager.original_init(*mixed[1:], **kwargs)
    412         except:
    413             with util.safe_reraise():

C:\ProgramData\Anaconda3\lib\site-packages\sqlalchemy\ext\declarative\base.py in _declarative_constructor(self, **kwargs)
    652                 "%r is an invalid keyword argument for %s" %
    653                 (k, cls_.__name__))
--> 654         setattr(self, k, kwargs[k])
    655 _declarative_constructor.__name__ = '__init__'
    656 

C:\ProgramData\Anaconda3\lib\site-packages\sqlalchemy\orm\attributes.py in __set__(self, instance, value)
    222     def __set__(self, instance, value):
    223         self.impl.set(instance_state(instance),
--> 224                       instance_dict(instance), value, None)
    225 
    226     def __delete__(self, instance):

C:\ProgramData\Anaconda3\lib\site-packages\sqlalchemy\orm\attributes.py in set(self, state, dict_, value, initiator, passive, check_old, pop)
    700         if self.dispatch.set:
    701             value = self.fire_replace_event(state, dict_,
--> 702                                             value, old, initiator)
    703         state._modified_event(dict_, self, old)
    704         dict_[self.key] = value

C:\ProgramData\Anaconda3\lib\site-packages\sqlalchemy\orm\attributes.py in fire_replace_event(self, state, dict_, value, previous, initiator)
    709                 state, value, previous,
    710                 initiator or self._replace_token or
--> 711                 self._init_append_or_replace_token())
    712         return value
    713 

C:\ProgramData\Anaconda3\lib\site-packages\sqlalchemy\orm\util.py in set_(state, value, oldvalue, initiator)
    116         def set_(state, value, oldvalue, initiator):
    117             if include_backrefs or not detect_is_backref(state, initiator):
--> 118                 return validator(state.obj(), key, value)
    119             else:
    120                 return value

C:\ProgramData\Anaconda3\lib\site-packages\pygtfs\gtfs_entities.py in in_range(self, key, value)
     59         else:
     60             int_value = int(value)
---> 61         assert int_value in int_choice, "value outside limits"
     62         return int_value
     63     return in_range

AssertionError: value outside limits

ValueError: could not convert string to float in gtfs_entities.py

Hi,

we're using pygtfs as part of Home Assistant [0.91.2] and have received the following error during setup:

File "/usr/local/lib/python3.7/site-packages/pygtfs/gtfs_entities.py", line 74, in in_range
    float_value = float(value)
ValueError: could not convert string to float: 

We are using the GTFS data set from VRN:
https://www.vrn.de/mam/service/downloads/vrn_gtfs.zip
and are trying to use a route between the following stops:
origin: "de:07332:1015"
destination: "de:07332:1009:1:Bus"

According to the documentation at https://www.home-assistant.io/components/gtfs/ we are opening this issue here.

Thank you for the work you do,
-Jan

Optionally specify schema

It would be nice to have the capability to optionally specify a schema to store the data in for databases that support schemas.

Unique constraint failed when trying to import the latest data from BART

Hello! While trying to import data from BART into Home Assistant using https://www.home-assistant.io/integrations/gtfs/ I ran into a SQLAlchemy error. I have confirmed that this error occurs with the latest pygtfs even when home assistant's code isn't in use:

gtfs2db append bart.zip bart.sqlite
Loading GTFS data for <class 'pygtfs.gtfs_entities.Agency'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.Stop'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.Transfer'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.Route'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.Fare'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.FareRule'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.ShapePoint'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.Service'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.ServiceException'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.Trip'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.Frequency'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.StopTime'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.FeedInfo'>:
Loading GTFS data for <class 'pygtfs.gtfs_entities.Translation'>:
1 record read for <class 'pygtfs.gtfs_entities.Agency'>.
182 records read for <class 'pygtfs.gtfs_entities.Stop'>.
99 records read for <class 'pygtfs.gtfs_entities.Transfer'>.
14 records read for <class 'pygtfs.gtfs_entities.Route'>.
2500 records read for <class 'pygtfs.gtfs_entities.Fare'>.
2500 records read for <class 'pygtfs.gtfs_entities.FareRule'>.
Traceback (most recent call last):
  File "/opt/homebrew/lib/python3.9/site-packages/sqlalchemy/engine/base.py", line 1750, in _execute_context
    self.dialect.do_executemany(
  File "/opt/homebrew/lib/python3.9/site-packages/sqlalchemy/engine/default.py", line 714, in do_executemany
    cursor.executemany(statement, parameters)
sqlite3.IntegrityError: UNIQUE constraint failed: transfers.feed_id, transfers.from_stop_id, transfers.to_stop_id

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/homebrew/bin/gtfs2db", line 8, in <module>
    sys.exit(main())
  File "/opt/homebrew/lib/python3.9/site-packages/pygtfs/gtfs2db.py", line 54, in main
    append_feed(schedule, args['<feed_file>'],
  File "/opt/homebrew/lib/python3.9/site-packages/pygtfs/loader.py", line 93, in append_feed
    schedule.session.flush()
  File "/opt/homebrew/lib/python3.9/site-packages/sqlalchemy/orm/session.py", line 3298, in flush
    self._flush(objects)
  File "/opt/homebrew/lib/python3.9/site-packages/sqlalchemy/orm/session.py", line 3438, in _flush
    transaction.rollback(_capture_exception=True)
  File "/opt/homebrew/lib/python3.9/site-packages/sqlalchemy/util/langhelpers.py", line 70, in __exit__
    compat.raise_(
  File "/opt/homebrew/lib/python3.9/site-packages/sqlalchemy/util/compat.py", line 207, in raise_
    raise exception
  File "/opt/homebrew/lib/python3.9/site-packages/sqlalchemy/orm/session.py", line 3398, in _flush
    flush_context.execute()
  File "/opt/homebrew/lib/python3.9/site-packages/sqlalchemy/orm/unitofwork.py", line 456, in execute
    rec.execute(self)
  File "/opt/homebrew/lib/python3.9/site-packages/sqlalchemy/orm/unitofwork.py", line 630, in execute
    util.preloaded.orm_persistence.save_obj(
  File "/opt/homebrew/lib/python3.9/site-packages/sqlalchemy/orm/persistence.py", line 242, in save_obj
    _emit_insert_statements(
  File "/opt/homebrew/lib/python3.9/site-packages/sqlalchemy/orm/persistence.py", line 1094, in _emit_insert_statements
    c = connection._execute_20(
  File "/opt/homebrew/lib/python3.9/site-packages/sqlalchemy/engine/base.py", line 1582, in _execute_20
    return meth(self, args_10style, kwargs_10style, execution_options)
  File "/opt/homebrew/lib/python3.9/site-packages/sqlalchemy/sql/elements.py", line 323, in _execute_on_connection
    return connection._execute_clauseelement(
  File "/opt/homebrew/lib/python3.9/site-packages/sqlalchemy/engine/base.py", line 1451, in _execute_clauseelement
    ret = self._execute_context(
  File "/opt/homebrew/lib/python3.9/site-packages/sqlalchemy/engine/base.py", line 1813, in _execute_context
    self._handle_dbapi_exception(
  File "/opt/homebrew/lib/python3.9/site-packages/sqlalchemy/engine/base.py", line 1994, in _handle_dbapi_exception
    util.raise_(
  File "/opt/homebrew/lib/python3.9/site-packages/sqlalchemy/util/compat.py", line 207, in raise_
    raise exception
  File "/opt/homebrew/lib/python3.9/site-packages/sqlalchemy/engine/base.py", line 1750, in _execute_context
    self.dialect.do_executemany(
  File "/opt/homebrew/lib/python3.9/site-packages/sqlalchemy/engine/default.py", line 714, in do_executemany
    cursor.executemany(statement, parameters)
sqlalchemy.exc.IntegrityError: (sqlite3.IntegrityError) UNIQUE constraint failed: transfers.feed_id, transfers.from_stop_id, transfers.to_stop_id
[SQL: INSERT INTO transfers (feed_id, from_stop_id, to_stop_id, transfer_type, min_transfer_time) VALUES (?, ?, ?, ?, ?)]
[parameters: ((1, 'MCAR', 'MCAR', 2, '20'), (1, 'MCAR', 'MCAR', 2, '240'), (1, 'MCAR', 'MCAR', 2, '240'), (1, 'MCAR', 'MCAR', 2, '20'), (1, 'MCAR', 'MCAR', 2, '20'), (1, 'MCAR', 'MCAR', 2, '20'), (1, 'MCAR', 'MCAR', 2, '20'), (1, 'MCAR', 'MCAR', 2, '20')  ... displaying 10 of 99 total bound parameter sets ...  (1, 'MONT', 'MONT', 2, '20'), (1, 'MONT', 'MONT', 2, '20'))]
(Background on this error at: http://sqlalche.me/e/14/gkpj)

relevant package version info:

pip3 list | egrep -i "pygtfs|sqlalchemy"
pygtfs     0.1.6
SQLAlchemy 1.4.20

Dutch GTFS data seems to have a different layout?

Hi,

Indirectly, I'm trying to get your awesome piece of software to work as it is implemented in Home Assistant :D I'm trying to add the GTFS data for The Netherlands; however, it seems to have some format issues. Could you confirm? The official data is here: http://gtfs.ovapi.nl/new/gtfs-nl.zip

The following error log pops up in home-assistant:

16-04-13 20:21:49 homeassistant.components.sensor: Error while setting up platform gtfs
Traceback (most recent call last):
  File "/usr/local/lib/python3.4/dist-packages/homeassistant/helpers/entity_component.py", line 91, in _setup_platform
    discovery_info)
  File "/usr/local/lib/python3.4/dist-packages/homeassistant/components/sensor/gtfs.py", line 157, in setup_platform
    config["origin"], config["destination"]))
  File "/usr/local/lib/python3.4/dist-packages/homeassistant/components/sensor/gtfs.py", line 176, in __init__
    self.update()
  File "/usr/local/lib/python3.4/dist-packages/homeassistant/components/sensor/gtfs.py", line 215, in update
    self._data_source))
  File "/home/rob/.homeassistant/lib/pygtfs/loader.py", line 87, in append_feed
    instance = gtfs_class(feed_id=feed_id, **records_as_dict)
  File "<string>", line 4, in __init__
  File "/home/rob/.homeassistant/lib/sqlalchemy/orm/state.py", line 306, in _initialize_instance
    manager.dispatch.init_failure(self, args, kwargs)
  File "/home/rob/.homeassistant/lib/sqlalchemy/util/langhelpers.py", line 60, in __exit__
    compat.reraise(exc_type, exc_value, exc_tb)
  File "/home/rob/.homeassistant/lib/sqlalchemy/util/compat.py", line 184, in reraise
    raise value
  File "/home/rob/.homeassistant/lib/sqlalchemy/orm/state.py", line 303, in _initialize_instance
    return manager.original_init(*mixed[1:], **kwargs)
  File "/home/rob/.homeassistant/lib/sqlalchemy/ext/declarative/base.py", line 649, in _declarative_constructor
    setattr(self, k, kwargs[k])
  File "/home/rob/.homeassistant/lib/sqlalchemy/orm/attributes.py", line 224, in __set__
    instance_dict(instance), value, None)
  File "/home/rob/.homeassistant/lib/sqlalchemy/orm/attributes.py", line 701, in set
    value, old, initiator)
  File "/home/rob/.homeassistant/lib/sqlalchemy/orm/attributes.py", line 710, in fire_replace_event
    self._init_append_or_replace_token())
  File "/home/rob/.homeassistant/lib/sqlalchemy/orm/util.py", line 118, in set_
    return validator(state.obj(), key, value)
  File "/home/rob/.homeassistant/lib/pygtfs/gtfs_entities.py", line 57, in in_range
    raise ValueError("Empty value not allowed in {0}".format(key))
ValueError: Empty value not allowed in wheelchair_boarding

Thanks so much! 👍

Rob

IDEA/FEATURE: allow (gtfs2db?) to add and/or delete based on date (calendar or calendar_dates)

Use case: data sources update with a certain frequency; from what I have seen this is anywhere between weekly and annually.
Especially for the frequently updated sources, there are cases where the new source is already published before the current data expires and there is no overlap. When performing a nightly automated update, one can end up in a situation where the new data source cannot yet be used because its first data is, e.g., tomorrow.
Options (not exhaustive)

  • do not unpack if the new data contains only future dates
  • add new data to current database based on date, i.e. combine sources
  • delete from db based on date, this would also assist in removing past calendar (dates) and trips, etc. with the idea to keep the db small and performing

Issues parsing Tisseo (Toulouse, France) GTFS data

I have an issue parsing Tisseo GTFS files. Using home-assistant, here's the error I have:

Traceback (most recent call last):
  File "/home/kernald/home-assistant/homeassistant/helpers/entity_component.py", line 91, in _setup_platform
    discovery_info)
  File "/home/kernald/home-assistant/homeassistant/components/sensor/gtfs.py", line 157, in setup_platform
    config["origin"], config["destination"]))
  File "/home/kernald/home-assistant/homeassistant/components/sensor/gtfs.py", line 176, in __init__
    self.update()
  File "/home/kernald/home-assistant/homeassistant/components/sensor/gtfs.py", line 215, in update
    self._data_source))
  File "/home/kernald/.homeassistant/lib/pygtfs/loader.py", line 87, in append_feed
    instance = gtfs_class(feed_id=feed_id, **records_as_dict)
  File "<string>", line 4, in __init__
  File "/home/kernald/.homeassistant/lib/sqlalchemy/orm/state.py", line 306, in _initialize_instance
    manager.dispatch.init_failure(self, args, kwargs)
  File "/home/kernald/.homeassistant/lib/sqlalchemy/util/langhelpers.py", line 60, in __exit__
    compat.reraise(exc_type, exc_value, exc_tb)
  File "/home/kernald/.homeassistant/lib/sqlalchemy/util/compat.py", line 184, in reraise
    raise value
  File "/home/kernald/.homeassistant/lib/sqlalchemy/orm/state.py", line 303, in _initialize_instance
    return manager.original_init(*mixed[1:], **kwargs)
  File "/home/kernald/.homeassistant/lib/sqlalchemy/ext/declarative/base.py", line 649, in _declarative_constructor
    setattr(self, k, kwargs[k])
  File "/home/kernald/.homeassistant/lib/sqlalchemy/orm/attributes.py", line 224, in __set__
    instance_dict(instance), value, None)
  File "/home/kernald/.homeassistant/lib/sqlalchemy/orm/attributes.py", line 701, in set
    value, old, initiator)
  File "/home/kernald/.homeassistant/lib/sqlalchemy/orm/attributes.py", line 710, in fire_replace_event
    self._init_append_or_replace_token())
  File "/home/kernald/.homeassistant/lib/sqlalchemy/orm/util.py", line 118, in set_
    return validator(state.obj(), key, value)
  File "/home/kernald/.homeassistant/lib/pygtfs/gtfs_entities.py", line 57, in in_range
    raise ValueError("Empty value not allowed in {0}".format(key))
ValueError: Empty value not allowed in wheelchair_boarding

Relevant home-assistant ticket is here: home-assistant/core#1761

Ignore invalid feed columns

Some transit operators add non-standard feed columns (which don't exist in the Google Extensions or in the official spec) to their feeds. For example, MBTA (this feed: http://www.mbta.com/uploadedfiles/MBTA_GTFS.zip) adds "route_sort_order" to routes.txt.

This causes pygtfs to fail, because it doesn't know what to do with these values.

Failure while writing Routes(route_id='Blue', agency_id='1', route_short_name='', route_long_name='Blue Line', route_desc='Rapid Transit', route_type='1', route_url='', route_color='2F5DA6', route_text_color='FFFFFF', route_sort_order='1')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python3.4/site-packages/pygtfs-0.1.2-py3.4.egg/pygtfs/loader.py", line 76, in append_feed
  File "<string>", line 4, in __init__
  File "/usr/lib64/python3.4/site-packages/sqlalchemy/orm/state.py", line 306, in _initialize_instance
    manager.dispatch.init_failure(self, args, kwargs)
  File "/usr/lib64/python3.4/site-packages/sqlalchemy/util/langhelpers.py", line 60, in __exit__
    compat.reraise(exc_type, exc_value, exc_tb)
  File "/usr/lib64/python3.4/site-packages/sqlalchemy/util/compat.py", line 184, in reraise
    raise value
  File "/usr/lib64/python3.4/site-packages/sqlalchemy/orm/state.py", line 303, in _initialize_instance
    return manager.original_init(*mixed[1:], **kwargs)
  File "/usr/lib64/python3.4/site-packages/sqlalchemy/ext/declarative/base.py", line 648, in _declarative_constructor
    (k, cls_.__name__))
TypeError: 'route_sort_order' is an invalid keyword argument for Route

It'd be better if these kinds of errors were ignored, so pygtfs would be usable even with not-exactly-standard feeds.
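One possible shape of that change, sketched against the loader code visible in the tracebacks (record._asdict() and gtfs_class): drop any keys that do not correspond to a mapped column before constructing the entity. This is an assumption about where such a filter could go, not the project's actual implementation.

# Hypothetical filtering step inside the loader's per-record loop.
valid_columns = {c.name for c in gtfs_class.__table__.columns}
record_dict = {k: v for k, v in record._asdict().items() if k in valid_columns}
instance = gtfs_class(feed_id=feed_id, **record_dict)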

Issue with latest version of SQLAlchemy

I got this kind of error with the latest version of SQLAlchemy:

Warning (from warnings module):
File "C:\Python34\lib\site-packages\sqlalchemy\orm\relationships.py", line 2667
for (pr, fr_) in other_props)
SAWarning: relationship 'Trip.feed' will copy column _feed.feed_id to column trips.feed_id, which conflicts with relationship(s): 'Trip.shape_points' (copies shapes.feed_id to trips.feed_id), 'ShapePoint.trips' (copies shapes.feed_id to trips.feed_id). Consider applying viewonly=True to read-only relationships, or provide a primaryjoin condition marking writable columns with the foreign() annotation.

Cannot load the zip file

Traceback (most recent call last):
  File "/home/mdminhaz/pycharm-community-2022.3.2/plugins/python-ce/helpers/pydev/pydevconsole.py", line 364, in runcode
    coro = func()
  File "<input>", line 1, in <module>
  File "/home/mdminhaz/PycharmProjects/gtfs-practice/venv/lib/python3.10/site-packages/pygtfs/loader.py", line 87, in append_feed
    instance = gtfs_class(feed_id=feed_id, **record._asdict())
  File "<string>", line 4, in __init__
  File "/home/mdminhaz/PycharmProjects/gtfs-practice/venv/lib64/python3.10/site-packages/sqlalchemy/orm/state.py", line 575, in _initialize_instance
    with util.safe_reraise():
  File "/home/mdminhaz/PycharmProjects/gtfs-practice/venv/lib64/python3.10/site-packages/sqlalchemy/util/langhelpers.py", line 147, in __exit__
    raise exc_value.with_traceback(exc_tb)
  File "/home/mdminhaz/PycharmProjects/gtfs-practice/venv/lib64/python3.10/site-packages/sqlalchemy/orm/state.py", line 573, in _initialize_instance
    manager.original_init(*mixed[1:], **kwargs)
  File "/home/mdminhaz/PycharmProjects/gtfs-practice/venv/lib64/python3.10/site-packages/sqlalchemy/orm/decl_base.py", line 2076, in _declarative_constructor
    setattr(self, k, kwargs[k])
  File "/home/mdminhaz/PycharmProjects/gtfs-practice/venv/lib64/python3.10/site-packages/sqlalchemy/orm/attributes.py", line 528, in __set__
    self.impl.set(
  File "/home/mdminhaz/PycharmProjects/gtfs-practice/venv/lib64/python3.10/site-packages/sqlalchemy/orm/attributes.py", line 1261, in set
    value = self.fire_replace_event(
  File "/home/mdminhaz/PycharmProjects/gtfs-practice/venv/lib64/python3.10/site-packages/sqlalchemy/orm/attributes.py", line 1276, in fire_replace_event
    value = fn(
  File "/home/mdminhaz/PycharmProjects/gtfs-practice/venv/lib64/python3.10/site-packages/sqlalchemy/orm/events.py", line 2590, in wrap
    return fn(target, *arg)
  File "/home/mdminhaz/PycharmProjects/gtfs-practice/venv/lib64/python3.10/site-packages/sqlalchemy/orm/util.py", line 322, in set_
    return validator(state.obj(), key, value)
  File "/home/mdminhaz/PycharmProjects/gtfs-practice/venv/lib/python3.10/site-packages/pygtfs/gtfs_entities.py", line 86, in is_float_none
    return float(value)
TypeError: float() argument must be a string or a real number, not 'NoneType'

include requirements.txt

If you install this module in a virtual environment, and install its dependencies with pip, you can use

pip freeze > requirements.txt

Then, users can install with

pip install -r requirements.txt

Makes deployment easier.

Error when converting gtfs.zip to postgresDB

I get the following error when running the gtfs2db script:

gtfs2db append gtfs.zip postgresql://localhost/gtfs_db

sqlalchemy.exc.ProgrammingError: (psycopg2.ProgrammingError) there is no unique constraint matching given keys for referenced table "stops"
 [SQL: '\nCREATE TABLE translations (\n\tfeed_id INTEGER, \n\ttrans_id VARCHAR NOT NULL, \n\tlang VARCHAR NOT NULL, \n\ttranslation VARCHAR, \n\tPRIMARY KEY (trans_id, lang), \n\tFOREIGN KEY(feed_id, trans_id) REFERENCES stops (feed_id, stop_name), \n\tFOREIGN KEY(feed_id) REFERENCES _feed (feed_id)\n)\n\n']

Any idea what the problem might be?

Modify Feed

What are your thoughts about enabling modifications to an imported feed? For my purposes, it would be nice to issue a command like "delete(routes, route_id=24)" or "delete(stop_times, route_id=15)" and have the relevant information removed. It would also be nice to have an interface for adding information.

If there's a more efficient way to accomplish this without writing custom code within pygtfs I'd love to hear about it.
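Until something like that exists, a sketch of doing it through the underlying SQLAlchemy session (entity names from pygtfs.gtfs_entities as used elsewhere on this page; the database path is a placeholder, and note that dependent rows such as trips and stop times are not removed automatically):

import pygtfs
from pygtfs.gtfs_entities import Route

sched = pygtfs.Schedule("gtfs.sqlite")
sched.session.query(Route).filter(Route.route_id == "24").delete(synchronize_session=False)
sched.session.commit()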

Looking at Stop Times by ID

Is there a way to look at stop_times through a function like stop_times_by_id()? E.g., I want to answer a question like: what are the stop times for the 33rd Street stop?

Looking through schedule.py - Is there an issue since stop_times has an underscore or is already plural while the other objects are singular and one word?

For now I am just running a basic loop in Python. Continuing with the earlier example, I am gathering every stop_times object with a stop_id that is equivalent to stops_by_id(33). I hope to hear your thoughts. Thank you in advance!
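In the meantime, a direct query sketch avoids the Python-side loop; StopTime and its stop_id, arrival_time and departure_time columns are assumptions based on the entity names and GTFS fields referenced elsewhere on this page, and "33" is a placeholder stop_id.

from pygtfs.gtfs_entities import StopTime

# All stop times for a given stop_id; sched is an existing pygtfs.Schedule.
times = (sched.session.query(StopTime)
         .filter(StopTime.stop_id == "33")
         .all())
for st in times:
    print(st.arrival_time, st.departure_time)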

varchar mysql errors

Trying to import a GTFS file into MySQL using the following, and getting a ton of VARCHAR-length errors from SQLAlchemy. What am I doing wrong?

import pygtfs
sched = pygtfs.Schedule('mysql://gtfs:gtfs@localhost:5432/gtfs_njt_rail')
pygtfs.append_feed(sched, "/home/anthony/code/pi_transitsign/data/gtfsdata/njt_rail_data")

here's the output

python2 import_rail_mysql.py

Traceback (most recent call last):
File "import_rail_mysql.py", line 17, in
sched = pygtfs.Schedule('mysql://gtfs:gtfs@localhost:5432/gtfs_njt_rail')
File "/usr/lib/python2.7/site-packages/pygtfs/schedule.py", line 36, in init
Base.metadata.create_all(self.engine)
File "/usr/lib/python2.7/site-packages/sqlalchemy/sql/schema.py", line 3695, in create_all
tables=tables)
File "/usr/lib/python2.7/site-packages/sqlalchemy/engine/base.py", line 1856, in _run_visitor
conn._run_visitor(visitorcallable, element, *_kwargs)
File "/usr/lib/python2.7/site-packages/sqlalchemy/engine/base.py", line 1481, in _run_visitor
*_kwargs).traverse_single(element)
File "/usr/lib/python2.7/site-packages/sqlalchemy/sql/visitors.py", line 121, in traverse_single
return meth(obj, *_kw)
File "/usr/lib/python2.7/site-packages/sqlalchemy/sql/ddl.py", line 730, in visit_metadata
_is_metadata_operation=True)
File "/usr/lib/python2.7/site-packages/sqlalchemy/sql/visitors.py", line 121, in traverse_single
return meth(obj, *_kw)
File "/usr/lib/python2.7/site-packages/sqlalchemy/sql/ddl.py", line 764, in visit_table
include_foreign_key_constraints=include_foreign_key_constraints
File "/usr/lib/python2.7/site-packages/sqlalchemy/engine/base.py", line 914, in execute
return meth(self, multiparams, params)
File "/usr/lib/python2.7/site-packages/sqlalchemy/sql/ddl.py", line 68, in _execute_on_connection
return connection._execute_ddl(self, multiparams, params)
File "/usr/lib/python2.7/site-packages/sqlalchemy/engine/base.py", line 962, in _execute_ddl
compiled = ddl.compile(dialect=dialect)
File "", line 1, in
File "/usr/lib/python2.7/site-packages/sqlalchemy/sql/elements.py", line 494, in compile
return self._compiler(dialect, bind=bind, *_kw)
File "/usr/lib/python2.7/site-packages/sqlalchemy/sql/ddl.py", line 26, in _compiler
return dialect.ddl_compiler(dialect, self, *_kw)
File "/usr/lib/python2.7/site-packages/sqlalchemy/sql/compiler.py", line 190, in init
self.string = self.process(self.statement, *_compile_kwargs)
File "/usr/lib/python2.7/site-packages/sqlalchemy/sql/compiler.py", line 213, in process
return obj._compiler_dispatch(self, *_kwargs)
File "/usr/lib/python2.7/site-packages/sqlalchemy/sql/visitors.py", line 81, in _compiler_dispatch
return meth(self, *_kw)
File "/usr/lib/python2.7/site-packages/sqlalchemy/sql/compiler.py", line 2170, in visit_create_table
(table.description, column.name, ce.args[0])
File "/usr/lib/python2.7/site-packages/sqlalchemy/util/compat.py", line 200, in raise_from_cause
reraise(type(exception), exception, tb=exc_tb, cause=cause)
File "/usr/lib/python2.7/site-packages/sqlalchemy/sql/compiler.py", line 2159, in visit_create_table
and not first_pk)
File "/usr/lib/python2.7/site-packages/sqlalchemy/sql/compiler.py", line 213, in process
return obj._compiler_dispatch(self, *_kwargs)
File "/usr/lib/python2.7/site-packages/sqlalchemy/sql/visitors.py", line 81, in compiler_dispatch
return meth(self, **kw)
File "/usr/lib/python2.7/site-packages/sqlalchemy/sql/compiler.py", line 2190, in visit_create_column
first_pk=first_pk
File "/usr/lib/python2.7/site-packages/sqlalchemy/dialects/mysql/base.py", line 1961, in get_column_specification
column.type, type_expression=column)
File "/usr/lib/python2.7/site-packages/sqlalchemy/sql/compiler.py", line 261, in process
return type_._compiler_dispatch(self, **kw)
File "/usr/lib/python2.7/site-packages/sqlalchemy/sql/visitors.py", line 81, in _compiler_dispatch
return meth(self, *kw)
File "/usr/lib/python2.7/site-packages/sqlalchemy/sql/compiler.py", line 2594, in visit_unicode
return self.visit_VARCHAR(type_, **kw)
File "/usr/lib/python2.7/site-packages/sqlalchemy/dialects/mysql/base.py", line 2313, in visit_VARCHAR
self.dialect.name)
sqlalchemy.exc.CompileError: (in table '_feed', column 'feed_name'): VARCHAR requires a length on dialect mysql

Drop Python 2 support

At this point, there's little reason to maintain backward compatibility with Python 2.
Moving forward, we should take advantage of some things that came with Python 3.

gtfs2db on BC Transit GTFS ERROR

When trying to build the database using gtfs2db append <feed_file> with this zip file as the feed (http://www.gtfs-data-exchange.com/agency/bc-transit-victoria-regional-transit-system/latest.zip), this happens.

Don't know if you intend to update any of this code, but here's something you might not have known.

    C:\Python27\lib\site-packages\sqlalchemy\orm\relationships.py:2667: SAWarning: relationship 'Trip.service' will copy column calendar.feed_id to column trips.feed_id, which conflicts with relationship(s): 'ShapePoint.trips' (copies shapes.feed_id to trips.feed_id), 'Trip.shape_points' (copies shapes.feed_id to trips.feed_id). Consider applying viewonly=True to read-only relationships, or provide a primaryjoin condition marking writable columns with the foreign() annotation.
      for (pr, fr_) in other_props)
    C:\Python27\lib\site-packages\sqlalchemy\orm\relationships.py:2667: SAWarning: relationship 'Service.trips' will copy column calendar.feed_id to column trips.feed_id, which conflicts with relationship(s): 'ShapePoint.trips' (copies shapes.feed_id to trips.feed_id), 'Trip.shape_points' (copies shapes.feed_id to trips.feed_id). Consider applying viewonly=True to read-only relationships, or provide a primaryjoin condition marking writable columns with the foreign() annotation.
      for (pr, fr_) in other_props)
    C:\Python27\lib\site-packages\sqlalchemy\orm\relationships.py:2667: SAWarning: relationship 'Trip.route' will copy column routes.feed_id to column trips.feed_id, which conflicts with relationship(s): 'ShapePoint.trips' (copies shapes.feed_id to trips.feed_id), 'Service.trips' (copies calendar.feed_id to trips.feed_id), 'Trip.service' (copies calendar.feed_id to trips.feed_id), 'Trip.shape_points' (copies shapes.feed_id to trips.feed_id). Consider applying viewonly=True to read-only relationships, or provide a primaryjoin condition marking writable columns with the foreign() annotation.
      for (pr, fr_) in other_props)
    C:\Python27\lib\site-packages\sqlalchemy\orm\relationships.py:2667: SAWarning: relationship 'Route.trips' will copy column routes.feed_id to column trips.feed_id, which conflicts with relationship(s): 'ShapePoint.trips' (copies shapes.feed_id to trips.feed_id), 'Service.trips' (copies calendar.feed_id to trips.feed_id), 'Trip.service' (copies calendar.feed_id to trips.feed_id), 'Trip.shape_points' (copies shapes.feed_id to trips.feed_id). Consider applying viewonly=True to read-only relationships, or provide a primaryjoin condition marking writable columns with the foreign() annotation.
      for (pr, fr_) in other_props)
    C:\Python27\lib\site-packages\sqlalchemy\orm\relationships.py:2667: SAWarning: relationship 'StopTime.stop' will copy column stops.feed_id to column stop_times.feed_id, which conflicts with relationship(s): 'Trip.stop_times' (copies trips.feed_id to stop_times.feed_id), 'StopTime.trip' (copies trips.feed_id to stop_times.feed_id). Consider applying viewonly=True to read-only relationships, or provide a primaryjoin condition marking writable columns with the foreign() annotation.
      for (pr, fr_) in other_props)
    C:\Python27\lib\site-packages\sqlalchemy\orm\relationships.py:2667: SAWarning: relationship 'Stop.stop_times' will copy column stops.feed_id to column stop_times.feed_id, which conflicts with relationship(s): 'Trip.stop_times' (copies trips.feed_id to stop_times.feed_id), 'StopTime.trip' (copies trips.feed_id to stop_times.feed_id). Consider applying viewonly=True to read-only relationships, or provide a primaryjoin condition marking writable columns with the foreign() annotation.
      for (pr, fr_) in other_props)
    C:\Python27\lib\site-packages\sqlalchemy\orm\relationships.py:2667: SAWarning: relationship 'Route.feed' will copy column _feed.feed_id to column routes.feed_id, which conflicts with relationship(s): 'Route.agency' (copies agency.feed_id to routes.feed_id), 'Agency.routes' (copies agency.feed_id to routes.feed_id). Consider applying viewonly=True to read-only relationships, or provide a primaryjoin condition marking writable columns with the foreign() annotation.
      for (pr, fr_) in other_props)
    C:\Python27\lib\site-packages\sqlalchemy\orm\relationships.py:2667: SAWarning: relationship 'Feed.routes' will copy column _feed.feed_id to column routes.feed_id, which conflicts with relationship(s): 'Route.agency' (copies agency.feed_id to routes.feed_id), 'Agency.routes' (copies agency.feed_id to routes.feed_id). Consider applying viewonly=True to read-only relationships, or provide a primaryjoin condition marking writable columns with the foreign() annotation.
      for (pr, fr_) in other_props)
    C:\Python27\lib\site-packages\sqlalchemy\orm\relationships.py:2667: SAWarning: relationship 'Trip.feed' will copy column _feed.feed_id to column trips.feed_id, which conflicts with relationship(s): 'Service.trips' (copies calendar.feed_id to trips.feed_id), 'ShapePoint.trips' (copies shapes.feed_id to trips.feed_id), 'Route.trips' (copies routes.feed_id to trips.feed_id), 'Trip.shape_points' (copies shapes.feed_id to trips.feed_id), 'Trip.route' (copies routes.feed_id to trips.feed_id), 'Trip.service' (copies calendar.feed_id to trips.feed_id). Consider applying viewonly=True to read-only relationships, or provide a primaryjoin condition marking writable columns with the foreign() annotation.
      for (pr, fr_) in other_props)
    C:\Python27\lib\site-packages\sqlalchemy\orm\relationships.py:2667: SAWarning: relationship 'Feed.trips' will copy column _feed.feed_id to column trips.feed_id, which conflicts with relationship(s): 'Service.trips' (copies calendar.feed_id to trips.feed_id), 'ShapePoint.trips' (copies shapes.feed_id to trips.feed_id), 'Route.trips' (copies routes.feed_id to trips.feed_id), 'Trip.shape_points' (copies shapes.feed_id to trips.feed_id), 'Trip.route' (copies routes.feed_id to trips.feed_id), 'Trip.service' (copies calendar.feed_id to trips.feed_id). Consider applying viewonly=True to read-only relationships, or provide a primaryjoin condition marking writable columns with the foreign() annotation.
      for (pr, fr_) in other_props)
    C:\Python27\lib\site-packages\sqlalchemy\orm\relationships.py:2667: SAWarning: relationship 'StopTime.feed' will copy column _feed.feed_id to column stop_times.feed_id, which conflicts with relationship(s): 'Trip.stop_times' (copies trips.feed_id to stop_times.feed_id), 'StopTime.stop' (copies stops.feed_id to stop_times.feed_id), 'Stop.stop_times' (copies stops.feed_id to stop_times.feed_id), 'StopTime.trip' (copies trips.feed_id to stop_times.feed_id). Consider applying viewonly=True to read-only relationships, or provide a primaryjoin condition marking writable columns with the foreign() annotation.
      for (pr, fr_) in other_props)
    C:\Python27\lib\site-packages\sqlalchemy\orm\relationships.py:2667: SAWarning: relationship 'Feed.stop_times' will copy column _feed.feed_id to column stop_times.feed_id, which conflicts with relationship(s): 'Trip.stop_times' (copies trips.feed_id to stop_times.feed_id), 'StopTime.stop' (copies stops.feed_id to stop_times.feed_id), 'Stop.stop_times' (copies stops.feed_id to stop_times.feed_id), 'StopTime.trip' (copies trips.feed_id to stop_times.feed_id). Consider applying viewonly=True to read-only relationships, or provide a primaryjoin condition marking writable columns with the foreign() annotation.
      for (pr, fr_) in other_props)
    C:\Python27\lib\site-packages\sqlalchemy\orm\relationships.py:2667: SAWarning: relationship 'FareRule.feed' will copy column _feed.feed_id to column fare_rules.feed_id, which conflicts with relationship(s): 'Route.fare_rules' (copies routes.feed_id to fare_rules.feed_id), 'FareRule.route' (copies routes.feed_id to fare_rules.feed_id). Consider applying viewonly=True to read-only relationships, or provide a primaryjoin condition marking writable columns with the foreign() annotation.
      for (pr, fr_) in other_props)
    C:\Python27\lib\site-packages\sqlalchemy\orm\relationships.py:2667: SAWarning: relationship 'Feed.fare_rules' will copy column _feed.feed_id to column fare_rules.feed_id, which conflicts with relationship(s): 'Route.fare_rules' (copies routes.feed_id to fare_rules.feed_id), 'FareRule.route' (copies routes.feed_id to fare_rules.feed_id). Consider applying viewonly=True to read-only relationships, or provide a primaryjoin condition marking writable columns with the foreign() annotation.
      for (pr, fr_) in other_props)
    C:\Python27\lib\site-packages\sqlalchemy\orm\relationships.py:2667: SAWarning: relationship 'Frequency.feed' will copy column _feed.feed_id to column frequencies.feed_id, which conflicts with relationship(s): 'Trip.frequencies' (copies trips.feed_id to frequencies.feed_id), 'Frequency.trip' (copies trips.feed_id to frequencies.feed_id). Consider applying viewonly=True to read-only relationships, or provide a primaryjoin condition marking writable columns with the foreign() annotation.
      for (pr, fr_) in other_props)
    C:\Python27\lib\site-packages\sqlalchemy\orm\relationships.py:2667: SAWarning: relationship 'Feed.frequencies' will copy column _feed.feed_id to column frequencies.feed_id, which conflicts with relationship(s): 'Trip.frequencies' (copies trips.feed_id to frequencies.feed_id), 'Frequency.trip' (copies trips.feed_id to frequencies.feed_id). Consider applying viewonly=True to read-only relationships, or provide a primaryjoin condition marking writable columns with the foreign() annotation.
      for (pr, fr_) in other_props)

    Traceback (most recent call last):
      File "C:\Python27\Scripts\gtfs2db-script.py", line 9, in <module>
        load_entry_point('pygtfs==0.1.2', 'console_scripts', 'gtfs2db')()
      File "build\bdist.win-amd64\egg\pygtfs\gtfs2db.py", line 54, in main
      File "build\bdist.win-amd64\egg\pygtfs\loader.py", line 76, in append_feed
      File "<string>", line 4, in __init__
      File "C:\Python27\lib\site-packages\sqlalchemy\orm\state.py", line 306, in _initialize_instance
        manager.dispatch.init_failure(self, args, kwargs)
      File "C:\Python27\lib\site-packages\sqlalchemy\util\langhelpers.py", line 60, in __exit__
        compat.reraise(exc_type, exc_value, exc_tb)
      File "C:\Python27\lib\site-packages\sqlalchemy\orm\state.py", line 303, in _initialize_instance
        return manager.original_init(*mixed[1:], **kwargs)
      File "C:\Python27\lib\site-packages\sqlalchemy\ext\declarative\base.py", line 649, in _declarative_constructor
        (k, cls_.__name__))
    TypeError: 'stop_short_name' is an invalid keyword argument for Stop
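
The immediate cause is a non-standard stop_short_name column in this feed's stops.txt, which the Stop constructor rejects. As a workaround one could strip columns pygtfs does not know about before loading. Below is a minimal sketch; the helper name and file names are made up, and the column list is assumed from the GTFS reference rather than taken from pygtfs itself.

    # Sketch: rewrite stops.txt without non-standard columns before running
    # `gtfs2db append`. KNOWN_STOP_COLUMNS is assumed from the GTFS reference.
    import csv
    import io
    import zipfile

    KNOWN_STOP_COLUMNS = {
        "stop_id", "stop_code", "stop_name", "stop_desc", "stop_lat", "stop_lon",
        "zone_id", "stop_url", "location_type", "parent_station",
        "stop_timezone", "wheelchair_boarding",
    }

    def strip_unknown_stop_columns(src_zip, dst_zip):
        with zipfile.ZipFile(src_zip) as zin, zipfile.ZipFile(dst_zip, "w") as zout:
            for name in zin.namelist():
                data = zin.read(name)
                if name == "stops.txt":
                    rows = list(csv.DictReader(io.StringIO(data.decode("utf-8-sig"))))
                    kept = [c for c in rows[0] if c in KNOWN_STOP_COLUMNS]
                    out = io.StringIO()
                    writer = csv.DictWriter(out, fieldnames=kept, extrasaction="ignore")
                    writer.writeheader()
                    writer.writerows(rows)  # drops e.g. stop_short_name
                    data = out.getvalue().encode("utf-8")
                zout.writestr(name, data)

    strip_unknown_stop_columns("latest.zip", "latest_cleaned.zip")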

In feeds where calendar.txt is missing, pygtfs creates entities that occur on "no day of the week"

According to https://gtfs.org/schedule/reference/#calendar_datestxt it's OK to omit calendar.txt when publishing a GTFS feed:

Alternate: Omit calendar.txt, and specify each date of service in calendar_dates.txt. This allows for considerable service variation and accommodates service without normal weekly schedules. In this case service_id is an ID.

However, if pygtfs encounters such a feed, it fills the calendar and calendar_dates SQLite tables in such a way that every service occurs on "no day of the week": monday, tuesday, wednesday and so on are all "0" in the SQLite table. The code responsible for this is here: https://github.com/jarondl/pygtfs/blob/master/pygtfs/loader.py#L97

An example feed that - when loaded into a SQLite database with pygtfs - will cause the described issue is here: https://komunikacja.bialystok.pl/cms/File/download/gtfs/google_transit.zip

(Another one is https://eu.ftp.opendatasoft.com/sncf/gtfs/transilien-gtfs.zip - it does have calendar.txt, but its many service exceptions in calendar_dates.txt produce plenty of entries in the "calendar" table where all days of the week are 0) - taken from home-assistant/core#68614

Example SQLite query:

    SELECT * from calendar, calendar_dates WHERE calendar.start_date = calendar_dates.date

    feed_id  service_id  monday  tuesday  wednesday  thursday  friday  saturday  sunday  start_date  end_date    feed_id  service_id  date        exception_type
    1        P_852       0       0        0          0         0       0         0       2023-10-27  2023-10-27  1        P_852       2023-10-27  1
    1        R_852       0       0        0          0         0       0         0       2023-10-28  2023-10-28  1        R_852       2023-10-28  1
    1        S_852       0       0        0          0         0       0         0       2023-10-29  2023-10-29  1        S_852       2023-10-29  1
    1        S_862       0       0        0          0         0       0         0       2023-11-01  2023-11-01  1        S_862       2023-11-01  1
    1        P_863       0       0        0          0         0       0         0       2023-11-02  2023-11-02  1        P_863       2023-11-02  1
    1        R_863       0       0        0          0         0       0         0       2023-11-04  2023-11-04  1        R_863       2023-11-04  1
    1        S_863       0       0        0          0         0       0         0       2023-11-05  2023-11-05  1        S_863       2023-11-05  1

Would it be possible to just set the day of the week based on the ServiceException date, rather than filling the entity with all 0?
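
For illustration, here is a minimal sketch of that suggestion (this is not how loader.py currently behaves; the function and names are made up): derive the weekday flags of a synthesized service from its calendar_dates.txt rows with exception_type 1.

    # Sketch: compute day-of-week flags for a synthesized Service row from its
    # calendar_dates exceptions instead of leaving them all 0.
    import datetime

    WEEKDAY_FIELDS = ["monday", "tuesday", "wednesday", "thursday",
                      "friday", "saturday", "sunday"]

    def weekday_flags(service_dates):
        """service_dates: iterable of (datetime.date, exception_type) pairs."""
        flags = dict.fromkeys(WEEKDAY_FIELDS, 0)
        for date, exception_type in service_dates:
            if exception_type == 1:  # 1 = service added on this date
                flags[WEEKDAY_FIELDS[date.weekday()]] = 1
        return flags

    # The P_852 service from the table above runs only on 2023-10-27, a Friday:
    print(weekday_flags([(datetime.date(2023, 10, 27), 1)]))
    # {'monday': 0, 'tuesday': 0, 'wednesday': 0, 'thursday': 0,
    #  'friday': 1, 'saturday': 0, 'sunday': 0}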


In case you're wondering how I came across this: via home-assistant/core#68614, an issue with the HomeAssistant GTFS integration (https://www.home-assistant.io/integrations/gtfs/), a widget that displays "when's the closest departure" somewhere on a tablet in your smart home.

The issue is that the sqlite db would get generated properly, but the widget would just always fail to find any departure.

This is because the widget's code relies on the following SQL query to find departures:

        SELECT trip.trip_id, trip.route_id,
               time(origin_stop_time.arrival_time) AS origin_arrival_time,
               time(origin_stop_time.departure_time) AS origin_depart_time,
               date(origin_stop_time.departure_time) AS origin_depart_date,
               origin_stop_time.drop_off_type AS origin_drop_off_type,
               origin_stop_time.pickup_type AS origin_pickup_type,
               origin_stop_time.shape_dist_traveled AS origin_dist_traveled,
               origin_stop_time.stop_headsign AS origin_stop_headsign,
               origin_stop_time.stop_sequence AS origin_stop_sequence,
               origin_stop_time.timepoint AS origin_stop_timepoint,
               time(destination_stop_time.arrival_time) AS dest_arrival_time,
               time(destination_stop_time.departure_time) AS dest_depart_time,
               destination_stop_time.drop_off_type AS dest_drop_off_type,
               destination_stop_time.pickup_type AS dest_pickup_type,
               destination_stop_time.shape_dist_traveled AS dest_dist_traveled,
               destination_stop_time.stop_headsign AS dest_stop_headsign,
               destination_stop_time.stop_sequence AS dest_stop_sequence,
               destination_stop_time.timepoint AS dest_stop_timepoint,
               calendar.thursday AS yesterday,
               calendar.friday AS today,
               
               calendar.start_date AS start_date,
               calendar.end_date AS end_date
        FROM trips trip
        INNER JOIN calendar calendar
                   ON trip.service_id = calendar.service_id
         INNER JOIN stop_times origin_stop_time
                    ON trip.trip_id = origin_stop_time.trip_id
        INNER JOIN stops start_station
                   ON origin_stop_time.stop_id = start_station.stop_id
        INNER JOIN stop_times destination_stop_time
                   ON trip.trip_id = destination_stop_time.trip_id
        INNER JOIN stops end_station
                   ON destination_stop_time.stop_id = end_station.stop_id
         WHERE (calendar.thursday = 1
                OR calendar.friday= 1
                
                )
        AND start_station.stop_id = "047"
                   AND end_station.stop_id = "303"
        AND origin_stop_sequence < dest_stop_sequence
        AND calendar.start_date <= "2023-11-02"
        AND calendar.end_date >= "2023-11-02"
        ORDER BY calendar.sunday DESC,
                 calendar.monday DESC,
                 
                 origin_stop_time.departure_time
        LIMIT 172800

https://github.com/home-assistant/core/blob/dev/homeassistant/components/gtfs/sensor.py#L295-L343

The problematic part is this condition:

WHERE (calendar.thursday = 1
                OR calendar.friday= 1
                
                )

Since all days of the week are 0, this condition never matches, and the query always returns 0 rows.
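
For what it's worth, here is a rough sketch (plain sqlite3 against the database gtfs2db writes, not the actual sensor.py code) of a lookup that also honours calendar_dates, so feeds like the one above still return trips. The database filename and the helper itself are hypothetical:

    # Sketch: find trips running on a given date even when all weekday flags
    # in the calendar row are 0, by also accepting calendar_dates exceptions.
    # Table and column names are the ones gtfs2db writes into SQLite.
    # (With several feeds in one database you would also match on feed_id.)
    import datetime
    import sqlite3

    WEEKDAY_COLUMNS = ["monday", "tuesday", "wednesday", "thursday",
                       "friday", "saturday", "sunday"]

    def trips_running_on(db_path, date_iso):
        weekday = WEEKDAY_COLUMNS[datetime.date.fromisoformat(date_iso).weekday()]
        query = f"""
            SELECT DISTINCT t.trip_id
            FROM trips t
            JOIN calendar c ON t.service_id = c.service_id
            LEFT JOIN calendar_dates cd
                   ON cd.service_id = t.service_id AND cd.date = :d
            WHERE cd.exception_type = 1                      -- service added on :d
               OR (c.{weekday} = 1                           -- regular weekday service
                   AND c.start_date <= :d AND c.end_date >= :d
                   AND COALESCE(cd.exception_type, 0) != 2)  -- and not removed on :d
        """
        with sqlite3.connect(db_path) as conn:
            return [row[0] for row in conn.execute(query, {"d": date_iso})]

    # e.g. trips_running_on("bialystok_gtfs.sqlite", "2023-11-02")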

Add making queries to documentation

Sorry if I missed this (and I didn't find a mailing list). Looping through entities is very convenient, but I'd like to specify queries for performance. Trying to make a query did not work as expected:

Stop = pygtfs.gtfs_entities.Stop
stops = schedule.session.query(Stop).all()

[...]

ArgumentError: Column-based expression object expected for argument 'order_by'; got: 'stop_sequence', type <type 'str'>     

Is there support for specifying SQLAlchemy queries in pygtfs? How do I use it?

Thanks.
Elliot
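
For reference: Schedule exposes the underlying SQLAlchemy session (as the snippet above already uses), so ordinary SQLAlchemy queries do work; the error comes from passing the string 'stop_sequence' to order_by instead of a mapped column attribute. A minimal sketch, with the database filename and trip id as placeholders:

    # Sketch: querying the pygtfs database directly through SQLAlchemy.
    import pygtfs
    from pygtfs.gtfs_entities import Stop, StopTime

    sched = pygtfs.Schedule("gtfs.sqlite")           # placeholder filename
    session = sched.session

    # order_by wants a column attribute, not the string 'stop_sequence':
    stop_times = (
        session.query(StopTime)
        .filter(StopTime.trip_id == "some_trip_id")  # placeholder trip id
        .order_by(StopTime.stop_sequence)
        .all()
    )

    # Joins work the same way, e.g. the stops served by that trip, in order
    # (with several feeds in one database you would also match on feed_id):
    stops = (
        session.query(Stop)
        .join(StopTime, Stop.stop_id == StopTime.stop_id)
        .filter(StopTime.trip_id == "some_trip_id")
        .order_by(StopTime.stop_sequence)
        .all()
    )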

Issue when using the library with Python 3.3

Hi,

I'm having an issue when using the library with my Python 3.3 setup.

Traceback (most recent call last):
  File "/home/antonin/PycharmProjects/transit-heatmap/transit-heatmap.py", line 6, in <module>
    pygtfs.append_feed(sched, "data/RATP_GTFS_FULL_25-09-2013.zip")
  File "/usr/local/lib/python3.3/dist-packages/pygtfs/loader.py", line 45, in append_feed
    fd = feed.Feed(feed_filename, strip_fields)
  File "/usr/local/lib/python3.3/dist-packages/pygtfs/feed.py", line 48, in __init__
    if six.PY2:
AttributeError: 'module' object has no attribute 'PY2'

When looking at my six module's source, I can see that PY2 is indeed not defined in it, only PY3:

# True if we are running on Python 3.
PY3 = sys.version_info[0] == 3

Maybe this should be checked with getattr, or inside a try/except?
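
A sketch of that suggestion (not the current feed.py code):

    import sys
    import six

    # Fall back gracefully when six.PY2 is missing, as in very old six
    # releases that only define PY3.
    PY2 = getattr(six, "PY2", sys.version_info[0] == 2)

    if PY2:
        pass  # python2-specific decoding path would go here
    else:
        pass  # python3 path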

Getting a validation error during bootstrap because the `location_type` value is not recognized

I'm getting the following error:

Logger: homeassistant.components.sensor
Source: components/gtfs/sensor.py:511
Integration: Sensor (documentation, issues)
First occurred: 7:28:50 PM (1 occurrences)
Last logged: 7:28:50 PM

Error while setting up gtfs platform for sensor
Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/helpers/entity_platform.py", line 231, in _async_setup_platform
    await asyncio.shield(task)
  File "/usr/local/lib/python3.8/concurrent/futures/thread.py", line 57, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/usr/src/homeassistant/homeassistant/components/gtfs/sensor.py", line 511, in setup_platform
    pygtfs.append_feed(gtfs, os.path.join(gtfs_dir, data))
  File "/usr/local/lib/python3.8/site-packages/pygtfs/loader.py", line 81, in append_feed
    instance = gtfs_class(feed_id=feed_id, **record._asdict())
  File "<string>", line 4, in __init__
  File "/usr/local/lib/python3.8/site-packages/sqlalchemy/orm/state.py", line 445, in _initialize_instance
    manager.dispatch.init_failure(self, args, kwargs)
  File "/usr/local/lib/python3.8/site-packages/sqlalchemy/util/langhelpers.py", line 70, in __exit__
    compat.raise_(
  File "/usr/local/lib/python3.8/site-packages/sqlalchemy/util/compat.py", line 211, in raise_
    raise exception
  File "/usr/local/lib/python3.8/site-packages/sqlalchemy/orm/state.py", line 442, in _initialize_instance
    return manager.original_init(*mixed[1:], **kwargs)
  File "/usr/local/lib/python3.8/site-packages/sqlalchemy/orm/decl_base.py", line 1145, in _declarative_constructor
    setattr(self, k, kwargs[k])
  File "/usr/local/lib/python3.8/site-packages/sqlalchemy/orm/attributes.py", line 458, in __set__
    self.impl.set(
  File "/usr/local/lib/python3.8/site-packages/sqlalchemy/orm/attributes.py", line 1070, in set
    value = self.fire_replace_event(
  File "/usr/local/lib/python3.8/site-packages/sqlalchemy/orm/attributes.py", line 1078, in fire_replace_event
    value = fn(
  File "/usr/local/lib/python3.8/site-packages/sqlalchemy/orm/util.py", line 185, in set_
    return validator(state.obj(), key, value)
  File "/usr/local/lib/python3.8/site-packages/pygtfs/gtfs_entities.py", line 65, in in_range
    raise PygtfsValidationError(
pygtfs.exceptions.PygtfsValidationError: location_type must be in range [None, 0, 1, 2], was 3

This happens because the following GTFS file, http://data.pid.cz/PID_GTFS.zip, has stops with location_type set to 3.
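
Until the validation range is widened, a workaround in the same spirit as the stops.txt rewrite sketched earlier is to blank out location_type values this pygtfs release rejects before loading; the GTFS spec has since added 3 (generic node) and 4 (boarding area). The helper and file names are made up:

    # Sketch: blank out location_type values outside [0, 1, 2] in stops.txt
    # before loading the feed. Lossy (nodes/boarding areas become plain rows),
    # but it lets this pygtfs release ingest the feed.
    import csv
    import io
    import zipfile

    ACCEPTED = {"", "0", "1", "2"}

    def clamp_location_type(src_zip, dst_zip):
        with zipfile.ZipFile(src_zip) as zin, zipfile.ZipFile(dst_zip, "w") as zout:
            for name in zin.namelist():
                data = zin.read(name)
                if name == "stops.txt":
                    rows = list(csv.DictReader(io.StringIO(data.decode("utf-8-sig"))))
                    for row in rows:
                        if row.get("location_type", "") not in ACCEPTED:
                            row["location_type"] = ""
                    out = io.StringIO()
                    writer = csv.DictWriter(out, fieldnames=list(rows[0]))
                    writer.writeheader()
                    writer.writerows(rows)
                    data = out.getvalue().encode("utf-8")
                zout.writestr(name, data)

    clamp_location_type("PID_GTFS.zip", "PID_GTFS_clamped.zip")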
