We’re going to revisit our example of bearded seal movement along the northwest Alaska coast, this time walking through the demo using the helper functions included in the crawlUtils package.
library(crawl)
crawl 2.3.0 (2022-05-13)
Demos and documentation can be found at our new GitHub repository:
https://dsjohnson.github.io/crawl_examples/
library(crawlUtils)
crawlUtils 0.1.02 (2022-06-23)
library(pathroutr)
library(dplyr)
Attaching package: 'dplyr'
The following objects are masked from 'package:stats':
filter, lag
The following objects are masked from 'package:base':
intersect, setdiff, setequal, union
library(purrr)
Attaching package: 'purrr'
The following object is masked from 'package:crawl':
flatten
library(ggplot2)
library(colorspace)
library(sf)
Linking to GEOS 3.10.2, GDAL 3.4.2, PROJ 8.2.1; sf_use_s2() is TRUE
Sourcing Land/Barrier Data
The first thing we need to do is source the relevant land (or other barrier) polygon data for our study area. The crawlUtils package has a built-in function for downloading a global coastline data file based on OpenStreetMap data. Because this is a relatively large file, the cu_download_osm() function downloads a local copy for you. The initial download is likely between 750 MB and 1 GB of data.
crawlUtils::cu_download_osm()
#> This function will download a considerable amount of coastline data.
#> Are you sure you want to proceed? [y/n]: y
We obviously don’t need the entire global coastline for our study, so we will want to crop the downloaded data to our study area. It’s important, though, that we provide a sensible buffer around the observations to fully capture the available land. There’s no exact science to choosing this value, but it can be especially important in smaller study areas with complicated coastlines. For this example, we’ll set the buffer to 100 km. A rough sketch of this cropping step is shown below.
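One way to approach this is to buffer the observed locations by 100 km and then crop the OSM land polygons to that region. The sketch below is illustrative, not definitive: obs_sf is a hypothetical sf object of observed seal locations in a projected CRS with meters as the unit, and the call to cu_osm_coast() reflects our reading of the crawlUtils API (a function that returns the stored coastline cropped to the extent of a supplied sf object); check the package documentation before relying on it.
library(sf)
library(crawlUtils)
# obs_sf: hypothetical sf object of observed locations
# (assumed projected CRS, units of meters)
# buffer the bounding box of the observations by 100 km (100,000 m)
study_area <- obs_sf |>
  st_bbox() |>
  st_as_sfc() |>
  st_buffer(100000)
# crop the downloaded OSM land polygons to the buffered study area;
# cu_osm_coast() is assumed here to accept an sf/sfc object and return
# the local OSM coastline polygons within its extent
land_barrier <- cu_osm_coast(study_area)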