Add pre commit configuration (original, reworking) #28

Open · wants to merge 13 commits into base: master
8 changes: 8 additions & 0 deletions .flake8
@@ -0,0 +1,8 @@
[flake8]
# The codes below are ignored because it was not clear to the author of this PR how to fix them;
# they should eventually be fixed and removed from this ignore list.
ignore = E203, E501, W503, B950, F821, B007, E402, E722, F401, F811, B001, B008, C901, E731, E231, B009, B303, B903, B011
max-line-length = 100
max-complexity = 18
select = A,B,C,E,F,W,T4,B9
exclude = .git,__pycache__,misc
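As an illustration (not part of the PR), E731 — assigning a lambda expression to a name — is one of the codes this config suppresses, so flake8 run with this configuration would accept code like the following sketch:

```python
# Hypothetical snippet: flake8 normally reports "E731 do not assign a lambda
# expression, use a def", but with E731 in the ignore list above it passes.
square = lambda x: x * x

# The equivalent def form that flake8 would prefer by default:
def square_def(x):
    return x * x

print(square(4), square_def(4))
```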
30 changes: 30 additions & 0 deletions .gitlab-ci.yml
@@ -0,0 +1,30 @@
stages:
- test

linting:
stage: test
image: python:3.5-slim
before_script:
- apt-get update
- apt-get install -y git
- pip3 install -U pre-commit==1.21.0
script:
- pre-commit run --all-files

test:
stage: test
image: python:3.5
variables:
PYTHONPATH: $PWD:$PYTHONPATH
script:
- ./setup.py test

security:
stage: test
image: python:3.5-slim
allow_failure: true
before_script:
- pip install .
- pip install safety
script:
- safety check
24 changes: 24 additions & 0 deletions .pre-commit-config.yaml
@@ -0,0 +1,24 @@
repos:
- repo: https://github.com/ambv/black
rev: 19.10b0
hooks:
- id: black
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v2.4.0
hooks:
- id: flake8
exclude: misc
additional_dependencies: [
'flake8==3.7.9',
'flake8-builtins==1.4.2',
'flake8-bugbear==20.1.2',
]
- repo: https://github.com/pycqa/bandit
rev: 1.6.2
hooks:
- id: bandit
args: [-lll]
- repo: https://github.com/pre-commit/mirrors-mypy
rev: v0.761
hooks:
- id: mypy
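For reference, a typical developer workflow with this configuration might look like the following sketch (standard pre-commit CLI commands; the CI pins pre-commit 1.21.0, but any recent version should behave similarly):

```shell
# Install the pre-commit tool itself
pip install pre-commit

# Register the git hook so the checks run automatically on every commit
pre-commit install

# Run all hooks (black, flake8, bandit, mypy) against the whole repository,
# which is exactly what the CI jobs do
pre-commit run --all-files
```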
24 changes: 21 additions & 3 deletions .travis.yml
@@ -2,16 +2,34 @@ language: python
python:
- "3.8"

# command to install dependencies
install: "pip install Cython && pip install . pre-commit"
Member commented:

> Could this be because install_requires happens only at install time, but Cython basically needs to be a dev requirement? Should we make a requirements-dev.txt file which has Cython and nose in it? That's what I have done in other projects. Then the first half becomes `pip install -r requirements-dev.txt`.

Contributor (PR author) replied:

> Applied in the new PR.
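For illustration, the requirements-dev.txt file suggested in the review might look roughly like this (a sketch; the thread names only Cython and nose, and the pre-commit entry and any version pins are assumptions):

```
# requirements-dev.txt (hypothetical sketch)
Cython
nose
pre-commit==1.21.0
```

The Travis install step could then become `install: "pip install -r requirements-dev.txt && pip install ."`, matching the reviewer's `pip install -r requirements-dev.txt` suggestion.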


jobs:
include:

- python: 3.6
stage: test

- python: 3.7
stage: test

- python: 3.8
stage: test

install: "pip install ."
# command to run tests
services:
- xvfb
before_script: # configure a headless display for testing plot generation
- "export DISPLAY=:99.0"

script: nosetests .
script:
- pre-commit run --all-files
- nosetests .

cache:
directories:
- $HOME/.cache/pre-commit
- $HOME/.cache/pip

notifications:
email:
20 changes: 8 additions & 12 deletions examples/example_export.py
@@ -10,24 +10,21 @@


# get elementary bus events (connections) taking place within a given time interval:
all_events = networks.temporal_network(g,
start_time_ut=start_ut,
end_time_ut=end_ut
)
all_events = networks.temporal_network(g, start_time_ut=start_ut, end_time_ut=end_ut)
print("Number of elementary PT events during rush hour in Kuopio: ", len(all_events))

# get elementary bus events (connections) taking place within a given time interval:
tram_events = networks.temporal_network(g,
start_time_ut=start_ut,
end_time_ut=end_ut,
route_type=route_types.TRAM
)
assert(len(tram_events) == 0) # there should be no trams in our example city (Kuopio, Finland)
tram_events = networks.temporal_network(
g, start_time_ut=start_ut, end_time_ut=end_ut, route_type=route_types.TRAM
)
assert len(tram_events) == 0 # there should be no trams in our example city (Kuopio, Finland)

# construct a networkx graph
print("\nConstructing a combined stop_to_stop_network")

graph = networks.combined_stop_to_stop_transit_network(g, start_time_ut=start_ut, end_time_ut=end_ut)
graph = networks.combined_stop_to_stop_transit_network(
g, start_time_ut=start_ut, end_time_ut=end_ut
)
print("Number of edges: ", len(graph.edges()))
print("Number of nodes: ", len(graph.nodes()))
print("Example edge: ", list(graph.edges(data=True))[0])
@@ -37,4 +34,3 @@
#################################################
# See also other functions in gtfspy.networks ! #
#################################################

16 changes: 12 additions & 4 deletions examples/example_filter.py
@@ -16,16 +16,24 @@
# filter by time and 3 kilometers from the city center
week_start = G.get_weekly_extract_start_date()
week_end = week_start + datetime.timedelta(days=7)
fe = FilterExtract(G, filtered_database_path, start_date=week_start, end_date=week_end,
buffer_lat=62.8930796, buffer_lon=27.6671316, buffer_distance_km=3)
fe = FilterExtract(
G,
filtered_database_path,
start_date=week_start,
end_date=week_end,
buffer_lat=62.8930796,
buffer_lon=27.6671316,
buffer_distance_km=3,
)

fe.create_filtered_copy()
assert (os.path.exists(filtered_database_path))
assert os.path.exists(filtered_database_path)

G = GTFS(filtered_database_path)

# visualize the routes of the filtered database
from gtfspy import mapviz
from matplotlib import pyplot as plt

mapviz.plot_route_network_from_gtfs(G)
plt.show()
plt.show()
25 changes: 17 additions & 8 deletions examples/example_import.py
@@ -7,25 +7,31 @@

def load_or_import_example_gtfs(verbose=False):
imported_database_path = "test_db_kuopio.sqlite"
if not os.path.exists(imported_database_path): # reimport only if the imported database does not already exist
if not os.path.exists(
imported_database_path
): # reimport only if the imported database does not already exist
print("Importing gtfs zip file")
import_gtfs.import_gtfs(["data/gtfs_kuopio_finland.zip"], # input: list of GTFS zip files (or directories)
imported_database_path, # output: where to create the new sqlite3 database
print_progress=verbose, # whether to print progress when importing data
location_name="Kuopio")
import_gtfs.import_gtfs(
["data/gtfs_kuopio_finland.zip"], # input: list of GTFS zip files (or directories)
imported_database_path, # output: where to create the new sqlite3 database
print_progress=verbose, # whether to print progress when importing data
location_name="Kuopio",
)

# Note: this is an optional step, which is not necessary for many use cases.
print("Computing walking paths using OSM")
G = gtfs.GTFS(imported_database_path)
G.meta['download_date'] = "2017-03-15"
G.meta["download_date"] = "2017-03-15"

osm_path = "data/kuopio_extract_mapzen_2017_03_15.osm.pbf"

# when using with the Kuopio test data set,
# this should raise a warning due to no nearby OSM nodes for one of the stops.
osm_transfers.add_walk_distances_to_db_python(imported_database_path, osm_path)

print("Note: for large cities we have also a faster option for computing footpaths that uses Java.)")
print(
    "Note: for large cities there is also a faster option for computing footpaths that uses Java."
)
dir_path = os.path.dirname(os.path.realpath(__file__))
java_path = os.path.join(dir_path, "../java_routing/")
print("Please see the contents of " + java_path + " for more details.")
@@ -35,7 +41,10 @@ def load_or_import_example_gtfs(verbose=False):

if verbose:
print("Location name:" + G.get_location_name()) # should print Kuopio
print("Time span of the data in unixtime: " + str(G.get_approximate_schedule_time_span_in_ut()))
print(
"Time span of the data in unixtime: "
+ str(G.get_approximate_schedule_time_span_in_ut())
)
# prints the time span in unix time
return G

2 changes: 1 addition & 1 deletion examples/example_map_visualization.py
@@ -14,4 +14,4 @@

# ax_thumbnail.figure.savefig("test_thumbnail.jpg")

plt.show()
plt.show()
21 changes: 10 additions & 11 deletions examples/example_plot_trip_counts.py
@@ -1,20 +1,20 @@
import functools
import os

from example_import import load_or_import_example_gtfs
from matplotlib import pyplot as plt

from gtfspy.gtfs import GTFS

G = load_or_import_example_gtfs()

daily_trip_counts = G.get_trip_counts_per_day()
f, ax = plt.subplots()

datetimes = [date.to_pydatetime() for date in daily_trip_counts['date']]
trip_counts = daily_trip_counts['trip_counts']
datetimes = [date.to_pydatetime() for date in daily_trip_counts["date"]]
trip_counts = daily_trip_counts["trip_counts"]

ax.bar(datetimes, trip_counts)
ax.axvline(G.meta['download_date'], color="red")
ax.axvline(G.meta["download_date"], color="red")
threshold = 0.96
ax.axhline(trip_counts.max() * threshold, color="red")
ax.axvline(G.get_weekly_extract_start_date(weekdays_at_least_of_max=threshold), color="yellow")
@@ -24,18 +24,17 @@
G = GTFS(weekly_db_path)
f, ax = plt.subplots()
daily_trip_counts = G.get_trip_counts_per_day()
datetimes = [date.to_pydatetime() for date in daily_trip_counts['date']]
trip_counts = daily_trip_counts['trip_counts']
datetimes = [date.to_pydatetime() for date in daily_trip_counts["date"]]
trip_counts = daily_trip_counts["trip_counts"]
ax.bar(datetimes, trip_counts)

events = list(G.generate_routable_transit_events(0, G.get_approximate_schedule_time_span_in_ut()[0]))
min_ut = float('inf')
events = list(
G.generate_routable_transit_events(0, G.get_approximate_schedule_time_span_in_ut()[0])
)
min_ut = float("inf")
for e in events:
min_ut = min(e.dep_time_ut, min_ut)

print(G.get_approximate_schedule_time_span_in_ut())

plt.show()



58 changes: 32 additions & 26 deletions examples/example_temporal_distance_profile.py
@@ -3,7 +3,9 @@

import example_import
from gtfspy.routing.helpers import get_transit_connections, get_walk_network
from gtfspy.routing.multi_objective_pseudo_connection_scan_profiler import MultiObjectivePseudoCSAProfiler
from gtfspy.routing.multi_objective_pseudo_connection_scan_profiler import (
MultiObjectivePseudoCSAProfiler,
)
from gtfspy.routing.node_profile_analyzer_time_and_veh_legs import NodeProfileAnalyzerTimeAndVehLegs

G = example_import.load_or_import_example_gtfs()
@@ -14,12 +16,12 @@
to_stop_I = None
stop_dict = G.stops().to_dict("index")
for stop_I, data in stop_dict.items():
if data['name'] == from_stop_name:
if data["name"] == from_stop_name:
from_stop_I = stop_I
if data['name'] == to_stop_name:
if data["name"] == to_stop_name:
to_stop_I = stop_I
assert (from_stop_I is not None)
assert (to_stop_I is not None)
assert from_stop_I is not None
assert to_stop_I is not None

# The start and end times between which PT operations (and footpaths) are scanned:
ANALYSIS_START_TIME_UT = G.get_suitable_date_for_daily_extract(ut=True) + 10 * 3600
@@ -40,17 +42,18 @@
# gtfspy.osm_transfers.add_walk_distances_to_db_python(..., cutoff_distance_m=2000).



mpCSA = MultiObjectivePseudoCSAProfiler(connections,
targets=[to_stop_I],
start_time_ut=CONNECTION_SCAN_START_TIME_UT,
end_time_ut=CONNECTION_SCAN_END_TIME_UT,
transfer_margin=120, # seconds
walk_network=walk_network,
walk_speed=1.5, # meters per second
verbose=True,
track_vehicle_legs=True,
track_time=True)
mpCSA = MultiObjectivePseudoCSAProfiler(
connections,
targets=[to_stop_I],
start_time_ut=CONNECTION_SCAN_START_TIME_UT,
end_time_ut=CONNECTION_SCAN_END_TIME_UT,
transfer_margin=120, # seconds
walk_network=walk_network,
walk_speed=1.5, # meters per second
verbose=True,
track_vehicle_legs=True,
track_time=True,
)

mpCSA.run()
profiles = mpCSA.stop_profiles
@@ -60,19 +63,21 @@
direct_walk_duration = departure_stop_profile.get_walk_to_target_duration()
# This equals inf, if walking distance between the departure_stop (from_stop_I) and target_stop (to_stop_I)
# is longer than MAX_WALK_LENGTH
analyzer = NodeProfileAnalyzerTimeAndVehLegs(departure_stop_profile.get_final_optimal_labels(),
direct_walk_duration,
ANALYSIS_START_TIME_UT,
ANALYSIS_END_TIME_UT)
analyzer = NodeProfileAnalyzerTimeAndVehLegs(
departure_stop_profile.get_final_optimal_labels(),
direct_walk_duration,
ANALYSIS_START_TIME_UT,
ANALYSIS_END_TIME_UT,
)

# Print out results:
stop_dict = G.stops().to_dict("index")
print("Origin: ", stop_dict[from_stop_I])
print("Destination: ", stop_dict[to_stop_I])
print("Minimum temporal distance: ", analyzer.min_temporal_distance() / 60., " minutes")
print("Mean temporal distance: ", analyzer.mean_temporal_distance() / 60., " minutes")
print("Medan temporal distance: ", analyzer.median_temporal_distance() / 60., " minutes")
print("Maximum temporal distance: ", analyzer.max_temporal_distance() / 60., " minutes")
print("Minimum temporal distance: ", analyzer.min_temporal_distance() / 60.0, " minutes")
print("Mean temporal distance: ", analyzer.mean_temporal_distance() / 60.0, " minutes")
print("Median temporal distance: ", analyzer.median_temporal_distance() / 60.0, " minutes")
print("Maximum temporal distance: ", analyzer.max_temporal_distance() / 60.0, " minutes")
# Note that the mean and max temporal distances have the value of `direct_walk_duration`,
# if there are no journey alternatives departing after (or at the same time as) `ANALYSIS_END_TIME_UT`.
# Thus, if you obtain a float('inf') value for some of the temporal distance measures, it could probably be
@@ -85,8 +90,9 @@

# use tex in plotting
rc("text", usetex=True)
fig1 = analyzer.plot_new_transfer_temporal_distance_profile(timezone=timezone_pytz,
format_string="%H:%M")
fig1 = analyzer.plot_new_transfer_temporal_distance_profile(
timezone=timezone_pytz, format_string="%H:%M"
)
fig2 = analyzer.plot_temporal_distance_pdf_horizontal(use_minutes=True)
print("Showing...")
plt.show()
Loading