
modelled_stream_crossings

** moved to https://github.com/smnorris/bcfishpass/tree/main/01_prep/01_modelled_stream_crossings **

Generate potential locations of road/railway stream crossings and associated structures in British Columbia.

In addition to generating the intersection points of roads/railways and streams, this tool attempts to:

  • remove duplicate crossings
  • identify crossings that are likely to be bridges/open bottom structures
  • maintain a consistent unique identifier value (modelled_crossing_id) that is stable with script re-runs

Methods and sources

Stream features

Streams are taken from the BC Freshwater Atlas and loaded using fwapg.

Linear transportation features

Road and railway features are downloaded from DataBC. Features used to generate stream crossings are defined by the queries below. The queries attempt to extract transportation features only at locations where there is likely to be a stream crossing structure.

Source                         | Query
-------------------------------|---------------------------------------------------------------------------------
Digital Road Atlas (DRA)       | transport_line_type_code NOT IN ('F','FP','FR','T','TR','TS','RP','RWA') AND transport_line_surface_code != 'D' AND transport_line_structure_code != 'T'
Forest Tenure Roads            | life_cycle_status_code NOT IN ('RETIRED', 'PENDING')
OGC Road Segment Permits       | status = 'Approved' AND road_type_desc != 'Snow Ice Road'
OGC Development Roads pre-2006 | petrlm_development_road_type != 'WINT'
NRN Railway Tracks             | structure_type != 'Tunnel' (via spatial join to gba_railway_structure_lines_sp)
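
For example, the DRA features of interest could be extracted with a query in the style below (a sketch only; the output table name is made up, and whse_basemapping.transport_line is the DRA table referenced in the attribute query further down):

CREATE TABLE dra_roads_of_interest AS
SELECT
  transport_line_id,
  geom
FROM whse_basemapping.transport_line
-- exclude feature types, surfaces and structures unlikely to be associated
-- with a stream crossing structure (codes per the DRA row in the table above)
WHERE transport_line_type_code NOT IN ('F','FP','FR','T','TR','TS','RP','RWA')
  AND transport_line_surface_code != 'D'
  AND transport_line_structure_code != 'T';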

Overlay and de-duplication

Intersection points of the stream and road/railway features are created. Because we want to generate locations of individual structures, crossings on the same stream are de-duplicated using several data-source-specific tolerances:

  • merge DRA crossings on freeways/highways with a 30m tolerance
  • merge DRA crossings on arterial/collector roads with a 20m tolerance
  • merge other types of DRA crossings within a 12.5m tolerance
  • merge FTEN crossings within a 12.5m tolerance
  • merge OGC crossings within a 12.5m tolerance
  • merge railway crossings within a 20m tolerance

DRA crossings are also merged across streams at a tolerance of 10m.
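
A minimal sketch of the same-source merge, assuming a staging table of raw DRA crossing points (dra_crossings_raw with crossing_id, blue_line_key and geom columns; these names are illustrative, not the script's): ST_ClusterDBSCAN groups points on the same stream that fall within the 12.5m tolerance, then one representative per cluster is kept.

WITH clustered AS (
  SELECT
    crossing_id,
    blue_line_key,
    geom,
    -- cluster crossings on the same stream within 12.5m of each other
    ST_ClusterDBSCAN(geom, eps := 12.5, minpoints := 1)
      OVER (PARTITION BY blue_line_key) AS cluster_id
  FROM dra_crossings_raw
)
SELECT DISTINCT ON (blue_line_key, cluster_id)
  crossing_id,
  blue_line_key,
  geom
FROM clustered
ORDER BY blue_line_key, cluster_id, crossing_id;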

After same-source crossings are merged, all crossings are de-duplicated using a 10m tolerance across all road data sources (railway crossings are not merged with the road sources). The location of an output crossing corresponds to the location from the highest-priority dataset, in decreasing order of priority: DRA, FTEN, OGC permits, OGC permits pre-2006. Despite the duplicate removal, the unique identifier of each source road within 10m of a crossing is retained, so all crossings can be linked back to their source road datasets.
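
A rough sketch of the cross-source step under that priority order (the table names are assumptions for illustration): a lower-priority crossing is dropped when a higher-priority crossing exists within 10m on the same stream, here FTEN against DRA.

DELETE FROM ften_crossings f
WHERE EXISTS (
  SELECT 1
  FROM dra_crossings d
  WHERE d.blue_line_key = f.blue_line_key   -- same stream
    AND ST_DWithin(d.geom, f.geom, 10)      -- within the 10m tolerance
);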

Bridges / Open Bottom Structures

Once duplicates have been removed, output crossings are identified/modelled as open bottom structures via these properties:

Source                     | Query                                                           | modelled_crossing_type_source
---------------------------|-----------------------------------------------------------------|-------------------------------
Stream order               | stream_order >= 6                                               | FWA_STREAM_ORDER
Rivers/double line streams | edge_type IN (1200, 1250, 1300, 1350, 1400, 1450, 1475)         | FWA_EDGE_TYPE
MOT structures             | bmis_structure_type = 'BRIDGE'                                  | MOT_ROAD_STRUCTURE_SP
PSCIS structures           | current_crossing_type_code = 'OBS'                              | PSCIS
DRA structures             | transport_line_structure_code IN ('B','C','E','F','O','R','V') | TRANSPORT_LINE_STRUCTURE_CODE
Railway structures         | structure_type LIKE 'BRIDGE%'                                   | GBA_RAILWAY_STRUCTURE_LINES_SP
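
As an illustration of how one of these rules can be applied (a sketch only, using the output table documented below and the FWA streams table), the stream order rule might look like:

-- flag crossings on 6th order and larger streams as open bottom structures
-- ('OBS' matches the PSCIS crossing type code in the table above), recording
-- which rule made the call
UPDATE fish_passage.modelled_stream_crossings c
SET
  modelled_crossing_type = 'OBS',
  modelled_crossing_type_source = array_append(
    COALESCE(c.modelled_crossing_type_source, ARRAY[]::text[]),
    'FWA_STREAM_ORDER'
  )
FROM whse_basemapping.fwa_stream_networks_sp s
WHERE c.linear_feature_id = s.linear_feature_id
  AND s.stream_order >= 6;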

Requirements

  • a FWA database created by fwapg
  • PostgreSQL (requires >= v12)
  • PostGIS (tested with v3.0.2)
  • GDAL (tested with v3.1.2)
  • Python (>=3.6)
  • bcdata
  • pscis
  • wget, unzip, parallel

Installation

The repository is a collection of sql files and shell (bash) scripts - no installation is required.

To get the latest:

git clone https://github.com/smnorris/modelled_stream_crossings.git
cd modelled_stream_crossings

Run the scripts

Scripts presume that environment variables PGHOST, PGUSER, PGDATABASE, PGPORT, DATABASE_URL are set to the appropriate db and that password authentication for the database is not required.

If required, download the data:

$ ./01_load.sh

Run the overlays and produce the outputs:

$ ./02_process.sh

Output

Output table is fish_passage.modelled_stream_crossings:

                                                            Table "fish_passage.modelled_stream_crossings"
            Column             |          Type          | Collation | Nullable |                                       Default
-------------------------------+------------------------+-----------+----------+--------------------------------------------------------------------------------------
 modelled_crossing_id          | integer                |           | not null | nextval('fish_passage.modelled_stream_crossings_modelled_crossing_id_seq'::regclass)
 modelled_crossing_type        | character varying(5)   |           |          |
 modelled_crossing_type_source | text[]                 |           |          |
 transport_line_id             | integer                |           |          |
 ften_road_section_line_id     | text                   |           |          |
 og_road_segment_permit_id     | integer                |           |          |
 og_petrlm_dev_rd_pre06_pub_id | integer                |           |          |
 railway_track_id              | integer                |           |          |
 linear_feature_id             | bigint                 |           |          |
 blue_line_key                 | integer                |           |          |
 downstream_route_measure      | double precision       |           |          |
 wscode_ltree                  | ltree                  |           |          |
 localcode_ltree               | ltree                  |           |          |
 watershed_group_code          | character varying(4)   |           |          |
 geom                          | geometry(PointZM,3005) |           |          |
Indexes:
    "modelled_stream_crossings_pkey" PRIMARY KEY, btree (modelled_crossing_id)
    "modelled_stream_crossings_blue_line_key_idx" btree (blue_line_key)
    "modelled_stream_crossings_ften_road_section_line_id_idx" btree (ften_road_section_line_id)
    "modelled_stream_crossings_geom_idx" gist (geom)
    "modelled_stream_crossings_linear_feature_id_idx" btree (linear_feature_id)
    "modelled_stream_crossings_og_petrlm_dev_rd_pre06_pub_id_idx" btree (og_petrlm_dev_rd_pre06_pub_id)
    "modelled_stream_crossings_og_road_segment_permit_id_idx" btree (og_road_segment_permit_id)
    "modelled_stream_crossings_railway_track_id_idx" btree (railway_track_id)
    "modelled_stream_crossings_transport_line_id_idx" btree (transport_line_id)

See basic summary stats for the current release in the reports folder. The date of the source data downloads is noted in CHANGES.txt and an extract of the current version is available here.
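
Counts by crossing type and watershed group can also be pulled directly from the output table, for example:

SELECT
  watershed_group_code,
  modelled_crossing_type,
  count(*) AS n_crossings
FROM fish_passage.modelled_stream_crossings
GROUP BY watershed_group_code, modelled_crossing_type
ORDER BY watershed_group_code, modelled_crossing_type;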

Optional

The script completely re-runs the analysis and creates new data each time it is run; modelled_crossing_id values will not be consistent between runs.

If retaining existing modelled_crossing_id values is required, modify and use the script sql/09_match_archived_crossings.sql to match ids against an existing table. Matching is done by finding nearest neighbouring points within 10m.
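
The matching is conceptually a nearest-neighbour join; a simplified sketch (the archive table name is an assumption, see sql/09_match_archived_crossings.sql for the real logic):

SELECT
  c.modelled_crossing_id AS new_id,
  a.modelled_crossing_id AS archived_id
FROM fish_passage.modelled_stream_crossings c
CROSS JOIN LATERAL (
  SELECT arch.modelled_crossing_id
  FROM fish_passage.modelled_stream_crossings_archive arch
  WHERE ST_DWithin(c.geom, arch.geom, 10)   -- only consider points within 10m
  ORDER BY c.geom <-> arch.geom             -- nearest first
  LIMIT 1
) a;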

For published output, this matching script has been run to match modelled_crossing_id to crossing_id values from v.2.3.1 of the Fish Passage Technical Working Group model.

Issues

port existing code

From fish passage model, extract code/logic that:

  • finds intersection points
  • removes duplicates within a set threshold (currently 10m) within the same source
  • removes duplicates between sources, retaining crossing from the higher priority source
  • labels bridges
  • matches output to the existing output layer and transfers existing unique ids to the new layer

This can probably all be rolled into a small number of sql files.
Note that DRA has changed; update to use the transport_lines table.

improve labelling of bridges/open bottom structures

Currently bridges are identified via:

  • double line stream crossings
  • TRIM FCODE

Improve by adding additional bridge data sources, for example MOT road structures and railway structure lines (see the methods table above).

We could also use PSCIS assessments https://catalogue.data.gov.bc.ca/dataset/pscis-assessments, but that would probably be better taken from the output of https://github.com/smnorris/pscis (after the PSCIS crossings are indexed to streams)

It is probably also worth looking at a stream_order threshold for modelling open bottom structures. Currently, for fish passage modelling in the HORS, LNIC, BULK and ELKR watershed groups, we use a threshold of stream order 5 or less for CBS (with one manually QA'ed exception): https://github.com/smnorris/cwf_fishpassage/blob/master/03_wsg_prioritize/sql/01_create_barriers_structures.sql#L95

do not convert measures to integer

Integer measures would be tidier but:

  • +/- 0.5m isn't much, but it does look odd when reviewing the output against the road lines
  • it looks like my math is off; some measures are off a road by almost a metre
  • this was only applied for DRA lines

additional attributes from sources?

I'm not sure this is necessary (attributes can be added in later steps) but these could all be added:

  • stream name
  • road name
  • road type
  • road surface
  • road operator

Like this, ordering COALESCE inputs based on which source we prefer for each:

SELECT
  x.modelled_crossing_id,
  s.gnis_name AS stream_name,
  -- order COALESCE inputs by the source preferred for each attribute
  COALESCE(dra.structured_name_1, ften.road_section_name, ogc.road_number||';'||ogc.proponent, ogc06.petrlm_development_road_name, rail.operator_english_name) AS road_name,
  COALESCE(ften.file_type_description, dratype.description, ogc.road_type_desc, ogc06.petrlm_development_road_type, rail.transport_type) AS road_type,
  drasurf.description AS road_surface,
  COALESCE(ften.client_name, ogc.proponent, ogc06.proponent, rail.operator_english_name) AS road_operator
FROM fish_passage.modelled_stream_crossings x
INNER JOIN whse_basemapping.fwa_stream_networks_sp s
ON x.linear_feature_id = s.linear_feature_id
LEFT OUTER JOIN whse_basemapping.transport_line dra
ON x.transport_line_id = dra.transport_line_id
LEFT OUTER JOIN whse_basemapping.transport_line_type_code dratype
ON dra.transport_line_type_code = dratype.transport_line_type_code
LEFT OUTER JOIN whse_basemapping.transport_line_surface_code drasurf
ON dra.transport_line_surface_code = drasurf.transport_line_surface_code
LEFT OUTER JOIN whse_forest_tenure.ften_road_section_lines_svw ften
ON x.ften_road_section_line_id = ften.id
LEFT OUTER JOIN whse_mineral_tenure.og_road_segment_permit_sp ogc
ON x.og_road_segment_permit_id = ogc.og_road_segment_permit_id
LEFT OUTER JOIN whse_mineral_tenure.og_petrlm_dev_rds_pre06_pub_sp ogc06
ON x.og_petrlm_dev_rd_pre06_pub_id = ogc06.og_petrlm_dev_rd_pre06_pub_id
LEFT OUTER JOIN whse_basemapping.gba_railway_tracks_sp rail
ON x.railway_track_id = rail.railway_track_id;

removing duplicate crossings - retain DRA crossings as first priority

In the existing model, when there are duplicate crossings (within 20m), FTEN crossings take priority over DRA. This is simply because we want the tenure information for fish passage assessment planning/prioritization.

However, DRA data is going to be more positionally accurate and is now mostly comprehensive.
We should probably retain the DRA crossing geometry and extract attributes from the nearby FTEN road.

create crossings with a trigger

DRA features are timestamped. It should be possible to create crossing records via a TRIGGER that fires when a DRA record is inserted/modified. This removes the need to re-run the whole job every time there is a DRA update, and as a bonus, the existing crossing IDs won't change. Out of scope for now, and I'm not sure how this would handle deleting crossings when road features have been removed or significantly shifted.
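
An illustrative sketch only, not a working design: refresh_crossings_for_road() is a hypothetical helper that would re-generate crossings for a single DRA feature.

CREATE OR REPLACE FUNCTION fish_passage.dra_crossing_refresh()
RETURNS trigger AS $$
BEGIN
  -- hypothetical helper: delete and re-create crossings for this road only
  PERFORM fish_passage.refresh_crossings_for_road(NEW.transport_line_id);
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER dra_crossing_refresh_trg
AFTER INSERT OR UPDATE ON whse_basemapping.transport_line
FOR EACH ROW
EXECUTE FUNCTION fish_passage.dra_crossing_refresh();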

better elimination of duplicates

When identifying duplicates, a PostGIS clustering function such as ST_ClusterWithin may work better in some circumstances than randomly picking a point from the identified duplicates. When combined with road type (e.g. divided highways), this could really improve some results.
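
For example, ST_ClusterWithin collects duplicate points within a tolerance into one geometry per cluster (duplicate_crossings is a hypothetical staging table; a real implementation would still need to pick a representative point and snap it back onto the stream):

SELECT unnest(ST_ClusterWithin(geom, 20)) AS cluster_geom
FROM duplicate_crossings;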

retain crossings on DRA trails?

DRA trails (transport_line_type_code IN ('T','TR','TD','TS'), currently 44,580 km in BC) are not currently considered.

The majority of these features are in the northeast (related to oil and gas exploration), where there is unlikely to be a crossing structure.

However, there are many trails in other areas that do have associated crossing structures.

It might be worth keeping the trail crossings in the modelled_stream_crossings table, filtering them out depending on requirements.
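
For reference, the extent of the trail features in question could be checked with something along these lines (using the DRA table referenced earlier and the type codes noted above):

SELECT round((sum(ST_Length(geom)) / 1000)::numeric, 1) AS trail_km
FROM whse_basemapping.transport_line
WHERE transport_line_type_code IN ('T','TR','TD','TS');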
