';
+ }
+ resultdiv.append(searchitem);
+ }
+ });
+});
diff --git a/assets/js/lunr/lunr-store.js b/assets/js/lunr/lunr-store.js
new file mode 100644
index 00000000..d2fd04e6
--- /dev/null
+++ b/assets/js/lunr/lunr-store.js
@@ -0,0 +1,85 @@
+var store = [
+ {
+ "title": "FAQ and Known Issues: ",
+ "excerpt": "Frequently Asked Questions (FAQ) If you have somehow stumbled here first, note that there are specifications, demos, and tutorials which expand upon much of the information here. These can be found, among other places, from the home page. Connectivity What is the distance from the camera that BLE will still work? It is standard Bluetooth 4.0 range and it depends on external factors such as: Interference: anything interfering with the signal will shorten the range. The type of device that the camera is connected to: BT classification distinguishes 3 device classes based on their power levels. Depending on the class of the connected device, the range varies from less than 10 meters to 100 meters. Can I connect using WiFi only? Theoretically yes, if you already know the SSID, password, and the camera’s WiFi AP has been enabled. However, practically no because BLE is required in order to discover this information and configure the AP. Can I connect using BLE only? Yes, however there is some functionality that is not possible over BLE such as accessing the media list and downloading files. How to allow third-party devices to automatically discover close-by GoPro cameras? Devices can only be discovered via BLE by scanning for advertising GoPro cameras Multi Camera Setups How many devices can connect to the camera? Simultaneously, only one device can connect at a time. However, the camera stores BLE security keys and other connection information so it is possible to connect multiple devices sequentially. Is there currently a way to connect multiple cameras on the same Wifi network? No. Cameras can only be connected through Wi-Fi by becoming an access point itself (generating its own Wi-Fi network), not as a peripheral. What is the time offset between multiple cameras executing the same command? In cases when camera sync is important, we recommend using the USB connection, which minimizes the variance among devices. The time drift among cameras connected by USB cable to the same host will be up to ~35ms. Using BLE for that purpose will further increase it. Is there a way to precisely time sync cameras so the footage can be aligned during post-processing? The cameras set their time via GPS. By default, the camera seeks GPS at the beginning of every session, but this can be hindered by typical limitations of GPS signals. Additionally, there are two advanced options that require GoPro Labs firmware installed on the camera. The best bet is multi-cam GPS sync. Another option is precise time calibration via a dynamic QR scan from a smartphone or PC. Streaming What are the differences between the streaming options for GoPros? There are currently 3 different options on how to stream out of GoPro cameras. They are available either via Wi-Fi, USB, or both. 
: Wifi : : USB : ViewFinder Preview LiveStream ViewFinder Preview Webcam Preview Webcam Orientation Floating or Fixed Landscape facing up Floating or Fixed Landscape: Floating or Fixed Landscape: Floating or Fixed Streaming Protocols UDP (MPEG-TS) RTMP UDP (MPEG-TS) UDP (MPEG-TS) UDP (MPEG-TS) \\ RTSP RTSP Connection Protocol Wifi - AP Mode WiFi - STA Mode NCM NCM NCM Resolution 480p, 720p 480p, 720p, 1080p 480p, 720p 720p, 1080p 720p, 1080p Frame Rate 30 30 30 30 30 Bitrate 2.5 - 4 mbps 0.8 - 8 mbps 2.5 - 4 mbps 6 mbps 6 mbps \\ depending on model configurable depending on model Stabilization Basic Stabilization HyperSmooth or none Basic Stabilization None None Image Quality Basic Same as recorded content Basic Basic Same as recorded content Minimum Latency 210 ms > 100ms (un-stabilized) 210 ms 210 ms 210 ms \\ > 1,100ms (stabilized) Audio None Stereo None None None Max Stream Time 150 minutes (720p on fully 85 minutes (720p on fully Unlimited (with external Unlimited(with external Unlimited (with external\\ charged Enduro battery) charged Enduro battery) power via USB) power via USB) power via USB How to achieve low latency feed streaming onto displays? The stream has a minimum latency of about 210ms. If you are seeing latency higher than that, we often find that as a result of using off-the-shelf libraries like ffmpeg which adds its own buffering. For preview and framing use cases, we don’t recommend using the live streaming RTMP protocol because it adds unnecessary steps in the pipeline, and puts the camera in the streaming preset, which offers little other control. A low latency streaming demo is available in the demos. How do I minimize latency of video preview with FFPLAY? FFPLAY by default will buffer remote streams. You can minimize the buffer using: --no-cache (receiving side) `-fflags nobuffer” (sender). However, the best latency can be achieved by building your own pipeline or ffmpegs library for decoding the bytes. Users should be able to achieve about 200-300 ms latency on the desktop and possibly optimize lower than that. How to view the video stream in VLC? To view the stream in VLC, you need to open network stream udp://@0.0.0.0:8554. You will still see latency because VLC uses its own caching. Power What are the power requirements for GoPro if connected over USB? All cameras have minimum power requirements, as specified here. As long as the power is supplied, cameras will be fully functional with or without an internal battery. Removing the battery and running on USB power alone will improve thermal performance and runtime. If you are seeing issues with insufficient power and have a charger with correct specs, the problems likely stem from low quality cables or low-quality adapters that are not able to consistently provide advertised amperage. We have encountered USB-C cables manufactured with poor quality control that transfer enough power only when their connectors face one side up, but not the other. We recommend using only high-quality components that deliver the correct power output How to enable automatic power on and off in wired setups? Cameras cannot be switched on remotely over USB, or “woken up” remotely after they “go to sleep”. The best workaround for this is via the GoPro Labs firmware that forces the camera to automatically switch on as soon as it detects USB power and switch off when the powering stops. Refer to the WAKE command here. Metadata Can I use the GPS track from the camera in real time? No. 
The GPS track on the camera as well as other metadata is not available until the file is written and saved. If the objective is to add metadata to the stream, currently the only option is to pull GPS data from another device (phone, wearable,… ) and sync it to the video feed. What can be accessed in the metadata file? Metadata exists as a proprietary GPMF (GoPro Metadata Format) and can be extracted from the file via API commands separately for GPS, Telemetry data, or the entire metadata container. The following data points can be extracted: Camera settings (Exposure time, ISO, Sensor Gain, White balance) Date and Time IMU: GPS, gyroscope, and accelerometer Smile detection Audio levels Face detection in bounding boxes Scene Classifiers (water, urban, vegetation, snow, beach, indoor) Is there a way to change the file names or otherwise classify my video file? Currently there are two options to do that, and both require GoPro Labs firmware. The stock firmware doesn’t provide that option. With GoPro Labs installed, you can either inject metadata into the file (and extract it later with the GPMF parser) or use custom naming for the file. Is there a way to add time stamps to the video files and mark specific moments? Open GoPro users can add time stamped markers, called “Hilights”, to flag specific moments in the video. Hilights can be injected into the video in the real time and then extracted for analytics or other post-processing purposes. The same Hilights are used in GoPro’s auto-editing engine Quik to determine the most interesting moments in the video. General Which cameras are supported by Open GoPro? The answer at a high level is >= Hero 9. However, there are also certain firmware requirements. For a complete answer, see the Specification. How to get the remaining timelapse capability? First check the value of Setting 128. Then depending on whether this is Photo or Video, use: Status 34 (Remaining photos) Status 35 (Remaining videos) Camera Logic Do commands operate as priority-wise or time-related? The cameras use first-in, first-out logic. Is there an option to send the commands in cyclic format instead of sending requests for each command? If you want to receive information asynchronously, it is possible via registering for BLE notifications. See an example (tracking battery) in the Python SDK. Troubleshooting If you are able to consistently reproduce a problem, please file a bug on Github Issues Why is the camera not advertising? If you have not yet paired to the camera with the desired device, then you need to first set the camera into pairing mode (Connections->Connect Device->Quick App). If you have already paired, then the camera should be advertising and ready to connect. If it is not advertising, it is possible you are already connected to it from a previous session. To be sure, power cycle both the camera and the peer device. Workaround for intermittent Wifi AP Connection failure On >= Hero 11, try disabling and then re-enabling the camera’s Wifi AP using the AP Control BLE Command Known Issues Relevant to All Supported Cameras Webcam does not enter idle mode once plugged in The webcam status will be wrongly reported as IDLE instead of OFF after a new USB connection. The best workaround for this is to call Webcam Start followed by Webcam Stop after connecting USB in order to make the webcam truly IDLE and thus willing to accept setting changes. 
Intermittent failure to connect to the camera’s Wifi Access Point On rare occasions, connections to the camera’s Wifi AP will continuously fail until the camera is reset. It is possible to work around this as described in Troubleshooting Spurious Protobuf Notifications sent once camera is connected in Station mode Once the camera has been connected in station mode (STA), it will start sending protobuf notifications with action ID 0xFF. These should be ignored. Hero 11 (v01.10.00) Specific Wired Communication is broken after update mode This is fixed by Resetting Connections and then re-pairing.",
+ "categories": [],
+ "tags": [],
+ "url": "/OpenGoPro/faq#"
+ },
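The FAQ above recommends disabling ffplay's buffering when viewing the preview stream. As a minimal sketch of that advice (not part of the Open GoPro tooling), the snippet below launches ffplay against the UDP address mentioned in the FAQ; it assumes ffplay is installed and on PATH and that the preview stream has already been started via the usual Open GoPro commands.

```python
# Minimal sketch: open the GoPro UDP preview stream in ffplay with reduced buffering.
# The address and the -fflags nobuffer option come from the FAQ text above;
# ffplay being available and the stream already running are assumptions.
import subprocess

STREAM_URL = "udp://@0.0.0.0:8554"  # default preview stream address from the FAQ

def view_preview_stream() -> None:
    """Launch ffplay against the UDP preview stream with its input buffering disabled."""
    subprocess.run(["ffplay", "-fflags", "nobuffer", STREAM_URL], check=True)

if __name__ == "__main__":
    view_preview_stream()
```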
+ {
+ "title": "Open GoPro: ",
+ "excerpt": "Getting Started Open GoPro is an API for interacting with GoPro cameras that is developed and managed directly by GoPro. It provides developers and companies with an easy-to-use programming interface and helps integrate the cameras into their ecosystems. The API works with off-the-shelf cameras with standard firmware, is free-to-use under MIT license, and publicly available online on GitHub. Docs Detailed Bluetooth Low Energy (BLE) and HTTP Interface Specifications. BLE Spec→ HTTP Spec → Tutorials Walk-through tutorials in different languages / frameworks for getting started. ✏️ Tutorials → Demos Complete runnable examples in different languages to use as base for your project. ⚙️ Demos → Compatibility Open GoPro API is supported on all camera models since Hero 9 with the following firmware version requirements: Camera Minimum FW Version Hero 12 Black v01.10.00 Hero 11 Black Mini v01.10.00 Hero 11 Black v01.10.00 Hero 10 Black v01.10.00 Hero 9 Black v01.70.00 While we strive to provide the same API functionality and logic for all newly launched cameras, minor changes are sometimes necessary. These are typically a consequence of HW component upgrades or improving or optimizing FW architecture. Therefore support for some functions and commands might be model-specific. This is described in the compatibility tables in the documentation. Interfaces Users can interact with the camera over BLE, WiFi, or wired USB. Both Wifi and USB are operated through HTTP server with identical commands. It is important to note that due to hard constraints of power and the hardware design, some commands in Wi-Fi are not available in BLE, and vice-versa. Bluetooth Low Energy (BLE) BLE is the fastest way to control the cameras and allow command and control functionality. BLE advertising is used for initial camera pairing. BLE is a requirement for any type of Wireless control since camera WiFi must be enabled upon each connection via BLE. WiFi WiFi needs to be switched on by a BLE command. Besides command & control, Wi-Fi also allows for video streaming and media manipulation. With the exception of live-streaming, the camera always acts as an Wi-Fi access point that other devices need to connect to. For more information, see the HTTP Specification. USB The USB connection can provide both data transfer and power to the camera. The power provided by the USB is sufficient for the camera to run indefinitely without the internal battery. However, the wired connection doesn’t allow for programmatic power on and off. Camera needs to be switched on manually or via BLE, and after the camera goes to sleep, it must be “woken up” again with a button press or via BLE. For monitoring and other use cases where the camera must be operated and switched on and off only via USB cable, there is a workaround with the Labs firmware - more detail in the FAQ. Control Camera and Record Remotely Here is a summary of currently supported per-interface features: Feature BLE WiFi USB Retrieve Camera State ✔️ ✔️ ✔️ Change Settings / Mode ✔️ ✔️ ✔️ Encode (Press shutter) ✔️ ✔️ ✔️ Stream Video ✔️ ✔️ Media Management ✔️ ✔️ Metadata Extraction ✔️ ✔️ Camera Connect / Wake ✔️ BLE, WiFi, and USB can be used to change settings and modes, start and stop capture, query remaining battery life, SD card capacity, or camera status (such as “is it recording?”). Most command-and-control functionality is disabled while the camera is recording video or is otherwise busy. There are 3 main recording modes for the cameras: photo, video, and timelapse. 
Within each mode, one can choose different frame rates, resolutions and FOV options. Note that not all cameras have all 3 recording modes, not all settings combinations are available for all camera models. The specification section links to json and xls files that show all available setting combinations per camera model. Stream Video Besides recording, the cameras can also stream video feed. The API provides 3 different ways to stream videos directly from the cameras, either via USB or wireless connection. Stream Type Description WiFi USB Record while Streaming : Preview : Moderate video quality, primarily for framing `>=` Hero 12 \\ Stream Low latency stabilization ✔️ ✔️ \\ Low power consumption : Webcam : Cinematic video quality \\ Mode Optional low latency stabilization ✔️ ✔️ : Live : Cinematic video quality \\ Stream Optional hypersmooth stabilization ✔️ ✔️ Each of the streaming types has different resolutions, bit rates, imaging pipelines, and different levels of configurability. Refer to the FAQ. Manipulate and Transfer Media Files When the camera records video, time lapse, or photo, the media is saved on the SD card with filenames according to the GoPro File Naming Convention. The cameras always record two versions of each video file Full resolution based on the selected settings (.mp4) Low resolution video proxy (.lrv) that can be used for editing or other operations where file size matters. The lrv file type can be renamed to mp4 and used for playback or further editing. Both versions exist in the same folder on the SD card. In addition, the cameras generate a thumbnail image (.thm) for each media file. The thm file type can be changed to jpeg if required. All three file types can be accessed, deleted, or copied via API commands. Extract Metadata GoPro cameras write metadata in each file, using a proprietary GPMF format. The metadata contains information including gyroscope, accelerometer, GPS, imaging-specific metadata, and several computed metrics such as scene classification. The metadata file cannot be edited or read while the camera is recording but can be accessed after the file has been written either entirely or selectively for a specific metric such as GPS. Use Multiple Cameras Simultaneously Controlling multiple cameras from one client is supported via BLE, Wifi, and USB with varying functionality depending on the protocol used. Refer to the table below. Protocol Available Functionality Notes BLE Command and control of several cameras Each camera can connect only to one BLE-enabled device at a time WiFi Command and control in Wi-fi station mode (COHN) COHN is available only from HERO12 onwards \\ Webcam and Preview Stream in Wi-fi station mode (COHN) Live-streaming (RTMP) RTMP stream must be initiated via BLE USB Command and control and streaming via Webcam mode Available only from HERO11 onward Use GoPro Cloud and Editing Engine The GoPro ecosystem includes a multitude of ways to edit, store, and replay content which are currently available for end-users as a part of paid subscription programs. Even when integrated into your ecosystem, GoPro cameras can take advantage of cloud backup and editing tools provided by GoPro including auto-upload to the cloud, automatic editing, and native live streaming. The GoPro cloud interface has been tailored to the needs of individual consumers. If you are interested in commercial usage, reach out to our business development team.",
+ "categories": [],
+ "tags": [],
+ "url": "/OpenGoPro/#"
+ },
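The media section above notes that each recording produces a full-resolution .mp4 plus a low-resolution .lrv proxy, and that the proxy can simply be renamed to .mp4 for playback. Below is a minimal sketch of that workflow for files already copied off the SD card; the folder path is a placeholder, not anything defined by the API.

```python
# Minimal sketch: make the low-resolution .LRV proxies playable by copying them
# with an .mp4 extension, per the note above that LRV files can simply be renamed.
from pathlib import Path
import shutil

def export_proxies(folder: Path) -> None:
    """Copy every .LRV proxy alongside itself with an .mp4 extension for playback."""
    for proxy in folder.glob("*.[Ll][Rr][Vv]"):
        target = proxy.with_suffix(".mp4")
        if not target.exists():  # don't clobber an existing file
            shutil.copyfile(proxy, target)
            print(f"{proxy.name} -> {target.name}")

if __name__ == "__main__":
    export_proxies(Path("./sdcard_copy"))  # placeholder path for media copied off the camera
```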
+ {
+ "title": "Tutorials: ",
+ "excerpt": "This set of tutorials is a series of sample scripts / files and accompanying .html walk-throughs to implement basic functionality to interact with a GoPro device using the following languages: - Python - Kotlin - More to come! The tutorials only support Open GoPro Version 2.0 and must be run on a [supported camera](/ble/index.htmlsupported-cameras). They will provide walk-throughs and sample code to use the relevant language / framework to exercise the Open GoPro Interface using Bluetooth Low Energy (BLE) and HTTP over WiFi. {% warning %} The tutorials are only tested on the latest camera / firmware combination. This is only an issue in cases where capabilities change between cameras such as setting options. {% endwarning %} The tutorials are meant as an introduction to the Open GoPro specification. They are not a substitute for the complete [BLE](/ble/index.html) and [HTTP](/http) specifications which will be your main source of reference after completing the tutorials. {% for tutorial in site.tutorials %} - [{{ tutorial.title }}]({{ tutorial.permalink | prepend: site.baseurl }}) {% endfor %}",
+ "categories": [],
+ "tags": [],
+ "url": "/OpenGoPro/tutorials/#"
+ },
+ {
+ "title": "Tutorial 1: Connect BLE: ",
+ "excerpt": "This tutorial will provide a walk-through to connect to the GoPro camera via Bluetooth Low Energy (BLE). Requirements Hardware A GoPro camera that is supported by Open GoPro python kotlin One of the following systems: Windows 10, version 16299 (Fall Creators Update) or greater Linux distribution with BlueZ >= 5.43 OS X/macOS support via Core Bluetooth API, from at least OS X version 10.11 An Android Device supporting SDK >= 33 Software python kotlin Python >= 3.9 and < 3.12 must be installed. See this Python installation guide. Android Studio >= 2022.1.1 (Electric Eel) Overview / Assumptions python kotlin This tutorial will use bleak to control the OS’s Bluetooth Low Energy (BLE). The Bleak BLE controller does not currently support autonomous pairing for the BlueZ backend. So if you are using BlueZ (i.e. Ubuntu, RaspberryPi, etc.), you need to first pair the camera from the command line as shown in the BlueZ tutorial. The bleak module is based on asyncio which means that its awaitable functions need to be called from an async coroutine. In order to do this, all of the code below should be running in an async function. We accomplish this in the tutorial scripts by making main async as such: import asyncio async def main() -> None: Put our code here if __name__ == \"__main__\": asyncio.run(main()) These are stripped down Python tutorials that are only meant to show the basics. For a complete Python SDK that uses bleak as the backend as well as a cross-platform WiFi backend to easily write Python apps that control the GoPro, see the Open GoPro Python SDK This tutorial will provide a set of Kotlin tutorials to demonstrate Open GoPro Functionality. The tutorials are provided as a single Android Studio project targeted to run on an Android device. The tutorials are only concerned with application-level demonstrations of the Open GoPro API and therefore do not prioritize the following: UI: The tutorial project only contains a minimal UI to select, implement, and view logs for each tutorial Android architecture / best practices: the project architecture is designed to encapsulate Kotlin functionality to easily display per-tutorial functionality Android-specific requirements: permission handling, adapter enabling, etc are implemented in the project but not documented in the tutorials BLE / Wifi (HTTP) functionality: A simple BLE API is included in the project and will be touched upon in the tutorials. However, the focus of the tutorials is not on how the BLE API is implemented as a real project would likely use a third-party library for this such as Kable See the Punchthrough tutorials for Android BLE-Specific tutorials These tutorials assume familiarity and a base level of competence with: Android Studio Bluetooth Low Energy JSON HTTP Setup python kotlin This set of tutorials is accompanied by a Python package consisting of scripts separated by tutorial module. These can be found on Github. Once the Github repo has been cloned or downloaded to your local machine, the package can be installed as follows: Enter the python tutorials directory at $INSTALL/demos/python/tutorial/ where $INSTALL is the top level of the Open GoPro repo where it exists on your local machine Use pip to install the package (in editable mode in case you want to test out some changes): pip install -e . While it is out of the scope of this tutorial to describe, it is recommended to install the package in to a virtual environment in order to isolate system dependencies. 
You can test that installation was successful by viewing the installed package’s information: $ pip show open-gopro-python-tutorials Name: open-gopro-python-tutorials Version: 0.0.3 Summary: Open GoPro Python Tutorials Home-page: https://github.com/gopro/OpenGoPro Author: Tim Camise Author-email: gopro.com License: MIT Location: c:\\users\\tim\\gopro\\opengopro\\demos\\python\\tutorial Requires: bleak, requests Required-by: This set of tutorials is accompanied by an Android Studio project consisting of, among other project infrastructure, Kotlin files separated by tutorial module. The project can be found on Github. Once the Github repo has been cloned or downloaded to your local machine, open the project in Android studio. At this point you should be able to build and load the project to your Android device. The project will not work on an emulated device since BLE can not be emulated. Just Show me the Demo!! python kotlin Each of the scripts for this tutorial can be found in the Tutorial 1 directory. Python >= 3.9 and < 3.12 must be used as specified in the requirements You can test connecting to your camera through BLE using the following script: python ble_connect.py See the help for parameter definitions: $ python ble_connect.py --help usage: ble_connect.py [-h] [-i IDENTIFIER] Connect to a GoPro camera, pair, then enable notifications. optional arguments: -h, --help show this help message and exit -i IDENTIFIER, --identifier IDENTIFIER Last 4 digits of GoPro serial number, which is the last 4 digits of the default camera SSID. If not used, first discovered GoPro will be connected to The Kotlin file for this tutorial can be found on Github. To perform the tutorial, run the Android Studio project, select “Tutorial 1” from the dropdown and click on “Perform.” Perform Tutorial 1 This will start the tutorial and log to the screen as it executes. When the tutorial is complete, click “Exit Tutorial” to return to the Tutorial selection screen. Basic BLE Tutorial This tutorial will walk through the process of connecting to a GoPro via BLE. This same connect functionality will be used as a foundation for all future BLE tutorials. Here is a summary of the sequence that will be described in detail in the following sections: GoProOpen GoPro user deviceGoProOpen GoPro user deviceScanningConnectedalt[If not Previously Paired]PairedReady to CommunicateAdvertisingAdvertisingConnectPair RequestPair ResponseEnable Notifications on Characteristic 1Enable Notifications on Characteristic 2Enable Notifications on Characteristic ..Enable Notifications on Characteristic N Advertise First, we need to ensure the camera is discoverable (i.e. it is advertising). Follow the per-camera steps here. The screen should appear as such: Camera is discoverable. Scan Next, we must scan to discover the advertising GoPro Camera. python kotlin We will do this using bleak. Let’s initialize an empty dict that will store discovered devices, indexed by name: Map of devices indexed by name devices: Dict[str, BleakDevice] = {} We’re then going to scan for all devices. We are passing a scan callback to bleak in order to also find non-connectable scan responses. We are keeping any devices that have a device name. 
Scan callback to also catch nonconnectable scan responses def _scan_callback(device: BleakDevice, _: Any) -> None: Add to the dict if not unknown if device.name and device.name != \"Unknown\": devices[device.name] = device Now discover and add connectable advertisements for device in await BleakScanner.discover(timeout=5, detection_callback=_scan_callback): if device.name != \"Unknown\" and device.name is not None: devices[device.name] = device Now we can search through the discovered devices to see if we found a GoPro. Any GoPro device name will be structured as GoPro XXXX where XXXX is the last four digits of your camera’s serial number. If you have renamed your GoPro to something other than the default, you will need to update the below steps accordingly. First, we define a regex which is either “GoPro “ followed by any four alphanumeric characters if no identifier was passed, or the identifier if it exists. In the demo ble_connect.py, the identifier is taken from the command-line arguments. token = re.compile(identifier or r\"GoPro [A-Z0-9]{4}\") Now we build a list of matched devices by checking if each device’s name includes the token regex. matched_devices: List[BleakDevice] = [] Now look for our matching device(s) matched_devices = [device for name, device in devices.items() if token.match(name)] Due to potential RF interference and the asynchronous nature of BLE advertising / scanning, it is possible that the advertising GoPro will not be discovered by the scanning PC in one scan. Therefore, you may need to redo the scan (as ble_connect.py does) until a GoPro is found. That is, matched_devices must contain at least one device. Similarly, connection establishment can fail for reasons out of our control. Therefore, the connection process is also wrapped in retry logic. Here is an example of the log from ble_connect.py of scanning for devices. Note that this includes several rescans until the device was found. $ python ble_connect.py INFO:root:Scanning for bluetooth devices... INFO:root: Discovered: INFO:root: Discovered: TR8600 seri INFO:root:Found 0 matching devices. INFO:root: Discovered: INFO:root: Discovered: TR8600 seri INFO:root: Discovered: GoPro Cam INFO:root: Discovered: GoPro 0456 INFO:root:Found 1 matching devices. Among other devices, you should see GoPro XXXX where XXXX is the last four digits of your camera’s serial number. First let’s define a filter that will be used to find GoPro device advertisements. We do this by filtering on the GoPro Service UUID that is included in all GoPro advertisements: private val scanFilters = listOf<ScanFilter>( ScanFilter.Builder().setServiceUuid(ParcelUuid.fromString(GOPRO_UUID)).build() ) We then send this to the BLE API and collect events from the SharedFlow that it returns. We take the first event emitted from this SharedFlow and notify (via a Channel) that a GoPro advertiser has been found, store the GoPro’s BLE address, and stop the scan. ble.startScan(scanFilters).onSuccess { scanResults -> val deviceChannel: Channel<BluetoothDevice> = Channel() // Collect scan results CoroutineScope(Dispatchers.IO).launch { scanResults.collect { scanResult -> // We will take the first discovered gopro deviceChannel.send(scanResult.device) } } // Wait to receive the scan result goproAddress = deviceChannel.receive().address ble.stopScan(scanResults) } At this point, the GoPro’s BLE address is stored (as a string) in goproAddress. 
Here is an example log output from this process: Scanning for GoPro's Received scan result: GoPro 0992 Found GoPro: GoPro 0992 Connect Now that we have discovered at least one GoPro device to connect to, the next step is to establish a BLE connection to the camera. python kotlin We're just taking the first device if there are multiple. device = matched_devices[0] client = BleakClient(device) await client.connect(timeout=15) An example output of this is shown here where we can see that the connection has successfully been established as well as the GoPro’s BLE MAC address: INFO:root:Establishing BLE connection to EF:5A:F6:13:E6:5A: GoPro 0456... INFO:bleak.backends.dotnet.client:Services resolved for BleakClientDotNet (EF:5A:F6:13:E6:5A) INFO:root:BLE Connected! ble.connect(goproAddress) At this point, the BLE connection is established but there is more setup to be done before we are ready to communicate. Pair The GoPro has encryption-protected characteristics which require us to pair before writing to them. Therefore now that we are connected, we need to attempt to pair. python kotlin try: await client.pair() except NotImplementedError: This is expected on Mac pass Not all OS’s allow pairing (at this time) but some require it. Rather than checking for the OS, we are just catching the exception when it fails. Rather than explicitly request pairing, we rely on the fact that Android will automatically start the pairing process if you try to read a characteristic that requires encryption. To do this, we read the Wifi AP Password characteristic. First we discover all characteristics (this will also be needed later when enabling notifications): ble.discoverCharacteristics(goproAddress) This API will discover the characteristics over-the-air but not return them here. They are stored to the ble object for later access via the servicesOf method. Then we read the relevant characteristic to trigger pairing: ble.readCharacteristic(goproAddress, GoProUUID.WIFI_AP_PASSWORD.uuid) At this point a pairing popup should occur on the Android Device. Select “Allow Pairing” to continue. Here is an example log output from this process: Discovering characteristics Discovered 9 services for F7:5B:5D:81:64:1B Service 00001801-0000-1000-8000-00805f9b34fb Characteristics: |-- Service 00001800-0000-1000-8000-00805f9b34fb Characteristics: |--00002a00-0000-1000-8000-00805f9b34fb: READABLE |--00002a01-0000-1000-8000-00805f9b34fb: READABLE |--00002a04-0000-1000-8000-00805f9b34fb: READABLE ... |------00002902-0000-1000-8000-00805f9b34fb: EMPTY |--b5f90082-aa8d-11e3-9046-0002a5d5c51b: WRITABLE |--b5f90083-aa8d-11e3-9046-0002a5d5c51b: NOTIFIABLE |------00002902-0000-1000-8000-00805f9b34fb: EMPTY |--b5f90084-aa8d-11e3-9046-0002a5d5c51b: NOTIFIABLE |------00002902-0000-1000-8000-00805f9b34fb: EMPTY Service 00001804-0000-1000-8000-00805f9b34fb Characteristics: |--00002a07-0000-1000-8000-00805f9b34fb: READABLE Pairing Read characteristic b5f90003-aa8d-11e3-9046-0002a5d5c51b : value: 66:3F:54:2D:38:35:72:2D:4E:35:63 Once paired, the camera should beep and display “Connection Successful”. This pairing process only needs to be done once. On subsequent connections, the devices will automatically re-establish encryption using stored keys. That is, they are “bonded.” Enable Notifications As specified in the Open GoPro BLE Spec, we must enable notifications for a given characteristic to receive responses from it. 
To enable notifications, we loop over each characteristic in each service and enable the characteristic for notification if it has notify properties: python kotlin It is necessary to define a notification handler to pass to the bleak start_notify method. Since we only care about connecting to the device in this tutorial (and not actually receiving data), we are just passing an empty function. A future tutorial will demonstrate how to use this meaningfully. for service in client.services: for char in service.characteristics: if \"notify\" in char.properties: await client.start_notify(char, notification_handler) In the following example output, we can see that notifications are enabled for each characteristic that is notifiable. INFO:root:Enabling notifications... INFO:root:Enabling notification on char 00002a19-0000-1000-8000-00805f9b34fb INFO:root:Enabling notification on char b5f90073-aa8d-11e3-9046-0002a5d5c51b INFO:root:Enabling notification on char b5f90075-aa8d-11e3-9046-0002a5d5c51b INFO:root:Enabling notification on char b5f90077-aa8d-11e3-9046-0002a5d5c51b INFO:root:Enabling notification on char b5f90079-aa8d-11e3-9046-0002a5d5c51b INFO:root:Enabling notification on char b5f90092-aa8d-11e3-9046-0002a5d5c51b INFO:root:Enabling notification on char b5f90081-aa8d-11e3-9046-0002a5d5c51b INFO:root:Enabling notification on char b5f90083-aa8d-11e3-9046-0002a5d5c51b INFO:root:Enabling notification on char b5f90084-aa8d-11e3-9046-0002a5d5c51b INFO:root:Done enabling notifications INFO:root:BLE Connection is ready for communication. ble.servicesOf(goproAddress).onSuccess { services -> services.forEach { service -> service.characteristics.forEach { char -> if (char.isNotifiable()) { ble.enableNotification(goproAddress, char.uuid) } } } } Here is an example log output from this process: Enabling notifications Enabling notifications for 00002a19-0000-1000-8000-00805f9b34fb Wrote to descriptor 00002902-0000-1000-8000-00805f9b34fb Enabling notifications for b5f90073-aa8d-11e3-9046-0002a5d5c51b Wrote to descriptor 00002902-0000-1000-8000-00805f9b34fb Enabling notifications for b5f90075-aa8d-11e3-9046-0002a5d5c51b Wrote to descriptor 00002902-0000-1000-8000-00805f9b34fb Enabling notifications for b5f90077-aa8d-11e3-9046-0002a5d5c51b Wrote to descriptor 00002902-0000-1000-8000-00805f9b34fb Enabling notifications for b5f90079-aa8d-11e3-9046-0002a5d5c51b Wrote to descriptor 00002902-0000-1000-8000-00805f9b34fb Enabling notifications for b5f90092-aa8d-11e3-9046-0002a5d5c51b Wrote to descriptor 00002902-0000-1000-8000-00805f9b34fb Enabling notifications for b5f90081-aa8d-11e3-9046-0002a5d5c51b Wrote to descriptor 00002902-0000-1000-8000-00805f9b34fb Enabling notifications for b5f90083-aa8d-11e3-9046-0002a5d5c51b Wrote to descriptor 00002902-0000-1000-8000-00805f9b34fb Enabling notifications for b5f90084-aa8d-11e3-9046-0002a5d5c51b Wrote to descriptor 00002902-0000-1000-8000-00805f9b34fb Bluetooth is ready for communication! The characteristics that correspond to each UUID listed in the log can be found in the Open GoPro API. These will be used in a future tutorial to send data. Once the notifications are enabled, the GoPro BLE initialization is complete and it is ready to communicate via BLE. Quiz time! 📚 ✏️ How often is it necessary to pair? A: Pairing must occur every time to ensure safe BLE communication. B: We never need to pair as the GoPro does not require it to communicate. C: Pairing only needs to occur once as the keys will be automatically re-used for future connections. 
Submit Answer Correct!! 😃 Incorrect!! 😭 The correct answer is C. Pairing is only needed once (assuming neither side deletes the keys). If the GoPro deletes the keys (via Connections->Reset Connections), the devices will need to re-pair. Troubleshooting Device not connecting If the connection is not starting, it is likely because the camera is not advertising. This can be due to either: The camera is not in pairing mode. Ensure that this is achieved as done in the advertise section. The devices never disconnected from the previous session so are thus already connected. If this is the case, perform the “Complete System Reset” shown below. Complete System Reset BLE is a fickle beast. If at any point it is impossible to discover or connect to the camera, perform the following. Reset the camera by choosing Connections –> Reset Connections Use your OS’s bluetooth settings GUI to remove / unpair the GoPro Restart the procedure detailed above Logs python kotlin The demo program has enabled bleak logs and is also using the default python logging module to write its own logs. To enable more bleak logs, follow bleak’s troubleshooting section. The demo program is using Timber. It is piping all log messages to the UI but they are also available in the logcat window and can be filtered using: package:mine tag:GP_. Good Job! Congratulations 🤙 You can now successfully connect to the GoPro via BLE and prepare it to receive / send data. To see how to send commands, you should advance to the next tutorial.",
+ "categories": [],
+ "tags": [],
+ "url": "/OpenGoPro/tutorials/connect-ble#"
+ },
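For reference, here is a condensed sketch of the Python flow described in the tutorial above, using bleak: scan until a "GoPro XXXX" advertiser appears, connect, attempt pairing, and enable notifications on every notifiable characteristic. It mirrors the excerpt rather than the full ble_connect.py demo, and the no-op notification handler is a stand-in for the handlers introduced in later tutorials.

```python
# Condensed sketch of the connect flow from Tutorial 1, assuming bleak is installed.
import asyncio
import re
from bleak import BleakClient, BleakScanner

async def connect_ble() -> BleakClient:
    token = re.compile(r"GoPro [A-Z0-9]{4}")
    matched = []
    while not matched:  # rescan until an advertising GoPro is discovered
        devices = await BleakScanner.discover(timeout=5)
        matched = [d for d in devices if d.name and token.match(d.name)]
    client = BleakClient(matched[0])  # just take the first match
    await client.connect(timeout=15)
    try:
        await client.pair()
    except NotImplementedError:
        pass  # expected on macOS, where the OS handles pairing

    async def noop_handler(characteristic, data: bytearray) -> None:
        pass  # placeholder; later tutorials parse the notifications

    for service in client.services:
        for char in service.characteristics:
            if "notify" in char.properties:
                await client.start_notify(char, noop_handler)
    return client

if __name__ == "__main__":
    asyncio.run(connect_ble())
```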
+ {
+ "title": "Tutorial 2: Send BLE TLV Commands: ",
+ "excerpt": "This document will provide a walk-through tutorial to use the Open GoPro BLE Interface to send Type-Length-Value (TLV)commands and receive TLV responses. Commands in this sense are operations that are initiated by either: Writing to the Command Request UUID and receiving responses via the Command Response UUID. Writing to the Setting UUID and receiving responses via the Setting Response UUID A list of TLV commands can be found in the [Command ID Table]/OpenGoPro/ble/protocol/id_tables.htmlcommand-ids). This tutorial only considers sending these as one-off commands. That is, it does not consider state management / synchronization when sending multiple commands. This will be discussed in a future lab. Requirements It is assumed that the hardware and software requirements from the connecting BLE tutorial are present and configured correctly. It is suggested that you have first completed the connecting BLE tutorial before going through this tutorial. Just Show me the Demo(s)!! python kotlin Each of the scripts for this tutorial can be found in the Tutorial 2 directory. Python >= 3.9 and < 3.12 must be used as specified in the requirements Set Shutter You can test sending the Set Shutter command to your camera through BLE using the following script: $ python ble_command_set_shutter.py See the help for parameter definitions: $ python ble_command_set_shutter.py --help usage: ble_command_set_shutter.py [-h] [-i IDENTIFIER] Connect to a GoPro camera, set the shutter on, wait 2 seconds, then set the shutter off. optional arguments: -h, --help show this help message and exit -i IDENTIFIER, --identifier IDENTIFIER Last 4 digits of GoPro serial number, which is the last 4 digits of the default camera SSID. If not used, first discovered GoPro will be connected to Load Preset Group You can test sending the Load Preset Group command to your camera through BLE using the following script: $ python ble_command_load_group.py See the help for parameter definitions: $ python ble_command_load_group.py --help usage: ble_command_load_group.py [-h] [-i IDENTIFIER] Connect to a GoPro camera, then change the Preset Group to Video. optional arguments: -h, --help show this help message and exit -i IDENTIFIER, --identifier IDENTIFIER Last 4 digits of GoPro serial number, which is the last 4 digits of the default camera SSID. If not used, first discovered GoPro will be connected to Set the Video Resolution You can test sending the Set Video Resolution command to your camera through BLE using the following script: $ python ble_command_set_resolution.py See the help for parameter definitions: $ python ble_command_set_resolution.py --help usage: ble_command_set_resolution.py [-h] [-i IDENTIFIER] Connect to a GoPro camera, then change the resolution to 1080. optional arguments: -h, --help show this help message and exit -i IDENTIFIER, --identifier IDENTIFIER Last 4 digits of GoPro serial number, which is the last 4 digits of the default camera SSID. If not used, first discovered GoPro will be connected to Set the Frames Per Second (FPS) You can test sending the Set FPS command to your camera through BLE using the following script: $ python ble_command_set_fps.py See the help for parameter definitions: $ python ble_command_set_fps.py --help usage: ble_command_set_fps.py [-h] [-i IDENTIFIER] Connect to a GoPro camera, then attempt to change the fps to 240. 
optional arguments: -h, --help show this help message and exit -i IDENTIFIER, --identifier IDENTIFIER Last 4 digits of GoPro serial number, which is the last 4 digits of the default camera SSID. If not used, first discovered GoPro will be connected to The Kotlin file for this tutorial can be found on Github. To perform the tutorial, run the Android Studio project, select “Tutorial 2” from the dropdown and click on “Perform.” This requires that a GoPro is already connected via BLE, i.e. that Tutorial 1 was already run. You can check the BLE status at the top of the app. Perform Tutorial 2 This will start the tutorial and log to the screen as it executes. When the tutorial is complete, click “Exit Tutorial” to return to the Tutorial selection screen. Setup We must first connect as was discussed in the connecting BLE tutorial. In this case, however, we are defining a functional (albeit naive) notification handler that will: Log the byte data and the handle that the notification was received on Check if the response is what we expected Set an event to notify the writer that the response was received This is a very simple handler; response parsing will be expanded upon in the next tutorial. python kotlin async def notification_handler(characteristic: BleakGATTCharacteristic, data: bytearray) -> None: logger.info(f'Received response at handle {characteristic.handle}: {data.hex(\":\")}') If this is the correct handle and the status is success, the command was a success if client.services.characteristics[characteristic.handle].uuid == response_uuid and data[2] == 0x00: logger.info(\"Command sent successfully\") Anything else is unexpected. This shouldn't happen else: logger.error(\"Unexpected response\") Notify the writer event.set() The event used above is a simple synchronization event that is only alerting the writer that a notification was received. For now, we’re just checking that the handle matches what is expected and that the status (third byte) is success (0x00). private val receivedData: Channel<UByteArray> = Channel() private fun naiveNotificationHandler(characteristic: UUID, data: UByteArray) { if ((characteristic == GoProUUID.CQ_COMMAND_RSP.uuid)) { CoroutineScope(Dispatchers.IO).launch { receivedData.send(data) } } } private val bleListeners by lazy { BleEventListener().apply { onNotification = ::naiveNotificationHandler } } The handler is simply verifying that the response was received on the correct UUID and then notifying the received data. We are registering this notification handler with the BLE API before sending any data requests as such: ble.registerListener(goproAddress, bleListeners) There is much more to the synchronization and data parsing than this but this will be discussed in future tutorials. Command Overview All commands follow the same procedure: Write to the relevant request UUID Receive confirmation from GoPro (via notification from relevant response UUID) that request was received. GoPro reacts to command The notification response only indicates that the request was received and whether it was accepted or rejected. The relevant behavior of the GoPro must be observed to verify when the command’s effects have been applied. 
Here is the procedure from power-on to finish: GoProOpen GoPro user deviceGoProOpen GoPro user devicedevices are connected as in Tutorial 1Command Request (Write to Request UUID)Command Response (via notification to Response UUID)Apply effects of command when able Sending Commands Now that we are connected, paired, and have enabled notifications (registered to our defined callback), we can send some commands. First, we need to define the UUIDs to write to / receive responses from, which are: python kotlin We’ll define these and any others used throughout the tutorials and store them in a GoProUUID class: class GoProUuid: COMMAND_REQ_UUID = GOPRO_BASE_UUID.format(\"0072\") COMMAND_RSP_UUID = GOPRO_BASE_UUID.format(\"0073\") SETTINGS_REQ_UUID = GOPRO_BASE_UUID.format(\"0074\") SETTINGS_RSP_UUID = GOPRO_BASE_UUID.format(\"0075\") QUERY_REQ_UUID = GOPRO_BASE_UUID.format(\"0076\") QUERY_RSP_UUID = GOPRO_BASE_UUID.format(\"0077\") WIFI_AP_SSID_UUID = GOPRO_BASE_UUID.format(\"0002\") WIFI_AP_PASSWORD_UUID = GOPRO_BASE_UUID.format(\"0003\") NETWORK_MANAGEMENT_REQ_UUID = GOPRO_BASE_UUID.format(\"0091\") NETWORK_MANAGEMENT_RSP_UUID = GOPRO_BASE_UUID.format(\"0092\") We’re using the GOPRO_BASE_UUID string imported from the module’s __init__.py to build these. These are defined in the GoProUUID class: const val GOPRO_UUID = \"0000FEA6-0000-1000-8000-00805f9b34fb\" const val GOPRO_BASE_UUID = \"b5f9%s-aa8d-11e3-9046-0002a5d5c51b\" enum class GoProUUID(val uuid: UUID) { WIFI_AP_PASSWORD(UUID.fromString(GOPRO_BASE_UUID.format(\"0003\"))), WIFI_AP_SSID(UUID.fromString(GOPRO_BASE_UUID.format(\"0002\"))), CQ_COMMAND(UUID.fromString(GOPRO_BASE_UUID.format(\"0072\"))), CQ_COMMAND_RSP(UUID.fromString(GOPRO_BASE_UUID.format(\"0073\"))), CQ_SETTING(UUID.fromString(GOPRO_BASE_UUID.format(\"0074\"))), CQ_SETTING_RSP(UUID.fromString(GOPRO_BASE_UUID.format(\"0075\"))), CQ_QUERY(UUID.fromString(GOPRO_BASE_UUID.format(\"0076\"))), CQ_QUERY_RSP(UUID.fromString(GOPRO_BASE_UUID.format(\"0077\"))); } Set Shutter The first command we will be sending is Set Shutter, which at byte level is: Command Bytes Set Shutter Off 0x03 0x01 0x01 0x00 Set Shutter On 0x03 0x01 0x01 0x01 Now, let’s write the bytes to the “Command Request” UUID to turn the shutter on and start encoding! python kotlin request_uuid = GoProUuid.COMMAND_REQ_UUID event.clear() request = bytes([3, 1, 1, 1]) await client.write_gatt_char(request_uuid.value, request, response=True) await event.wait() Wait to receive the notification response We make sure to clear the synchronization event before writing, then pend on the event until it is set in the notification callback. val setShutterOnCmd = ubyteArrayOf(0x03U, 0x01U, 0x01U, 0x01U) ble.writeCharacteristic(goproAddress, GoProUUID.CQ_COMMAND.uuid, setShutterOnCmd) // Wait to receive the notification response, then check its status checkStatus(receivedData.receive()) We’re waiting to receive the data from the queue that is posted to in the notification handler when the response is received. You should hear the camera beep and it will either take a picture or start recording depending on what mode it is in. Also note that we have received the “Command Status” notification response from the Command Response characteristic since we enabled its notifications in Enable Notifications. 
This can be seen in the demo log: python kotlin Setting the shutter on Writing to GoProUuid.COMMAND_REQ_UUID: 03:01:01:01 Received response at GoProUuid.COMMAND_RSP_UUID: 02:01:00 Command sent successfully Writing characteristic b5f90072-aa8d-11e3-9046-0002a5d5c51b ==> 03:01:01:01 Wrote characteristic b5f90072-aa8d-11e3-9046-0002a5d5c51b Characteristic b5f90073-aa8d-11e3-9046-0002a5d5c51b changed | value: 02:01:00 Received response on b5f90073-aa8d-11e3-9046-0002a5d5c51b: 02:01:00 Command sent successfully As expected, the response was received on the correct UUID and the status was “success” (third byte == 0x00). If you are recording a video, continue reading to set the shutter off: We’re waiting 2 seconds in case you are in video mode so that we can capture a 2 second video. python kotlin await asyncio.sleep(2) request_uuid = GoProUuid.COMMAND_REQ_UUID request = bytes([3, 1, 1, 0]) event.clear() await client.write_gatt_char(request_uuid.value, request, response=True) await event.wait() Wait to receive the notification response This will log in the console as follows: Setting the shutter off Writing to GoProUuid.COMMAND_REQ_UUID: 03:01:01:00 Received response at GoProUuid.COMMAND_RSP_UUID: 02:01:00 Command sent successfully delay(2000) val setShutterOffCmd = ubyteArrayOf(0x03U, 0x01U, 0x01U, 0x00U) // Wait to receive the notification response, then check its status checkStatus(receivedData.receive()) We’re waiting to receive the data from the queue that is posted to in the notification handler when the response is received. This will log as such: Setting the shutter off Writing characteristic b5f90072-aa8d-11e3-9046-0002a5d5c51b ==> 03:01:01:00 Wrote characteristic b5f90072-aa8d-11e3-9046-0002a5d5c51b Characteristic b5f90073-aa8d-11e3-9046-0002a5d5c51b changed | value: 02:01:00 Received response on b5f90073-aa8d-11e3-9046-0002a5d5c51b: 02:01:00 Command sent successfully Load Preset Group The next command we will be sending is Load Preset Group, which is used to toggle between the 3 groups of presets (video, photo, and timelapse). At byte level, the commands are: Command Bytes Load Video Preset Group 0x04 0x3E 0x02 0x03 0xE8 Load Photo Preset Group 0x04 0x3E 0x02 0x03 0xE9 Load Timelapse Preset Group 0x04 0x3E 0x02 0x03 0xEA Now, let’s write the bytes to the “Command Request” UUID to change the preset group to Video! python kotlin request_uuid = GoProUuid.COMMAND_REQ_UUID request = bytes([0x04, 0x3E, 0x02, 0x03, 0xE8]) event.clear() await client.write_gatt_char(request_uuid.value, request, response=True) await event.wait() Wait to receive the notification response We make sure to clear the synchronization event before writing, then pend on the event until it is set in the notification callback. val loadPreset = ubyteArrayOf(0x04U, 0x3EU, 0x02U, 0x03U, 0xE8U) ble.writeCharacteristic(goproAddress, GoProUUID.CQ_COMMAND.uuid, loadPreset) // Wait to receive the notification response, then check its status checkStatus(receivedData.receive()) We’re waiting to receive the data from the queue that is posted to in the notification handler when the response is received. You should hear the camera beep and move to the Video Preset Group. You can tell this by the logo at the top middle of the screen: Load Preset Group Also note that we have received the “Command Status” notification response from the Command Response characteristic since we enabled its notifications in Enable Notifications. This can be seen in the demo log: python kotlin Loading the video preset group... 
Sending to GoProUuid.COMMAND_REQ_UUID: 04:3e:02:03:e8 Received response at GoProUuid.COMMAND_RSP_UUID: 02:3e:00 Command sent successfully Loading Video Preset Group Writing characteristic b5f90072-aa8d-11e3-9046-0002a5d5c51b ==> 04:3E:02:03:E8 Wrote characteristic b5f90072-aa8d-11e3-9046-0002a5d5c51b Characteristic b5f90073-aa8d-11e3-9046-0002a5d5c51b changed | value: 02:3E:00 Received response on b5f90073-aa8d-11e3-9046-0002a5d5c51b: 02:3E:00 Command status received Command sent successfully As expected, the response was received on the correct UUID and the status was “success” (third byte == 0x00). Set the Video Resolution The next command we will be sending is Set Setting to set the Video Resolution. This is used to change the value of the Video Resolution setting. It is important to note that this only affects video resolution (not photo). Therefore, the Video Preset Group must be active in order for it to succeed. This can be done either manually through the camera UI or by sending Load Preset Group. This resolution only affects the current video preset. Each video preset can have its own independent values for video resolution. Here are some of the byte level commands for various video resolutions. Command Bytes Set Video Resolution to 1080 0x03 0x02 0x01 0x09 Set Video Resolution to 2.7K 0x03 0x02 0x01 0x04 Set Video Resolution to 5K 0x03 0x02 0x01 0x18 Now, let’s write the bytes to the “Setting Request” UUID to change the video resolution to 1080! python kotlin request_uuid = GoProUuid.COMMAND_REQ_UUID request = bytes([0x03, 0x02, 0x01, 0x09]) event.clear() await client.write_gatt_char(request_uuid.value, request, response=True) await event.wait() Wait to receive the notification response We make sure to clear the synchronization event before writing, then pend on the event until it is set in the notification callback. val setResolution = ubyteArrayOf(0x03U, 0x02U, 0x01U, 0x09U) ble.writeCharacteristic(goproAddress, GoProUUID.CQ_COMMAND.uuid, setResolution) // Wait to receive the notification response, then check its status checkStatus(receivedData.receive()) We’re waiting to receive the data from the queue that is posted to in the notification handler when the response is received. You should see the video resolution change to 1080 in the pill in the bottom-middle of the screen: Set Video Resolution Also note that we have received the “Command Status” notification response from the Command Response characteristic since we enabled its notifications in Enable Notifications. This can be seen in the demo log: python kotlin Setting the video resolution to 1080 Writing to GoProUuid.SETTINGS_REQ_UUID: 03:02:01:09 Received response at GoProUuid.SETTINGS_RSP_UUID: 02:02:00 Command sent successfully Setting resolution to 1080 Writing characteristic b5f90072-aa8d-11e3-9046-0002a5d5c51b ==> 03:02:01:09 Wrote characteristic b5f90072-aa8d-11e3-9046-0002a5d5c51b Characteristic b5f90073-aa8d-11e3-9046-0002a5d5c51b changed | value: 02:02:00 Received response on b5f90073-aa8d-11e3-9046-0002a5d5c51b: 02:02:00 Command status received Command sent successfully As expected, the response was received on the correct UUID and the status was “success” (third byte == 0x00). If the Preset Group was not Video, the status will not be success. Set the Frames Per Second (FPS) The next command we will be sending is Set Setting to set the FPS. This is used to change the value of the FPS setting. It is important to note that this setting is dependent on the video resolution. 
That is, certain FPS values are not valid with certain resolutions. In general, higher resolutions only allow lower FPS values. Other settings such as the current anti-flicker value may further limit possible FPS values. Futhermore, these capabilities all vary by camera. Check the camera capabilities to see which FPS values are valid for given use cases. Therefore, for this step of the tutorial, it is assumed that the resolution has been set to 1080 as in Set the Video Resolution. Here are some of the byte level commands for various FPS values. Command Bytes Set FPS to 24 0x03 0x03 0x01 0x0A Set FPS to 60 0x03 0x03 0x01 0x05 Set FPS to 240 0x03 0x03 0x01 0x00 Note that the possible FPS values can vary based on the Camera that is being operated on. Now, let’s write the bytes to the “Setting Request” UUID to change the FPS to 240! python kotlin request_uuid = GoProUuid.COMMAND_REQ_UUID request = bytes([0x03, 0x03, 0x01, 0x00]) event.clear() await client.write_gatt_char(request_uuid.value, request, response=True) await event.wait() Wait to receive the notification response We make sure to clear the synchronization event before writing, then pend on the event until it is set in the notification callback. val setFps = ubyteArrayOf(0x03U, 0x03U, 0x01U, 0x00U) ble.writeCharacteristic(goproAddress, GoProUUID.CQ_COMMAND.uuid, setFps) // Wait to receive the notification response, then check its status checkStatus(receivedData.receive()) We’re waiting to receive the data from the queue that is posted to in the notification handler when the response is received. You should see the FPS change to 240 in the pill in the bottom-middle of the screen: Set FPS Also note that we have received the “Command Status” notification response from the Command Response characteristic since we enabled its notifications in Enable Notifications.. This can be seen in the demo log: python kotlin Setting the fps to 240 Writing to GoProUuid.SETTINGS_REQ_UUID: 03:03:01:00 Received response at GoProUuid.SETTINGS_RSP_UUID: 02:03:00 Command sent successfully Setting the FPS to 240 Writing characteristic b5f90072-aa8d-11e3-9046-0002a5d5c51b ==> 03:03:01:00 Wrote characteristic b5f90072-aa8d-11e3-9046-0002a5d5c51b Characteristic b5f90073-aa8d-11e3-9046-0002a5d5c51b changed | value: 02:03:00 Received response on b5f90073-aa8d-11e3-9046-0002a5d5c51b: 02:03:00 Command status received Command sent successfully As expected, the response was received on the correct UUID and the status was “success” (third byte == 0x00). If the video resolution was higher, for example 5K, this would fail. Quiz time! 📚 ✏️ Which of the following is not a real preset group? A: Timelapse B: Photo C: Burst D: Video Submit Answer Correct!! 😃 Incorrect!! 😭 The correct answer is C. There are 3 preset groups (Timelapse, Photo, and Video). These can be set via the Load Preset Group command. True or False: Every combination of resolution and FPS value is valid. A: True B: False Submit Answer Correct!! 😃 Incorrect!! 😭 The correct answer is B. Each resolution can support all or only some FPS values. You can find out which resolutions support which FPS values by consulting the capabilities section of the spec. True or False: Every camera supports the same combination of resolution and FPS values. A: True B: False Submit Answer Correct!! 😃 Incorrect!! 😭 The correct answer is B. The only way to know what values are supported is to first check the Open GoPro version. See the relevant version of the BLE or WiFi spec to see what is supported. 
Troubleshooting See the first tutorial’s troubleshooting section. Good Job! Congratulations 🤙 You can now send any of the other BLE commands detailed in the Open GoPro documentation in a similar manner. To see how to parse responses, proceed to the next tutorial.",
+ "categories": [],
+ "tags": [],
+ "url": "/OpenGoPro/tutorials/send-ble-commands#"
+ },
+ {
+ "title": "Tutorial 3: Parse BLE TLV Responses: ",
+ "excerpt": "This document will provide a walk-through tutorial to implement the Open GoPro Interface to parse BLE Type-Length-Value (TLV) Responses. Besides TLV, some BLE operations instead return protobuf responses. These are not considered here and will be discussed in a future tutorial. This tutorial will provide an overview of how to handle responses of both single and multiple packet lengths, then give parsing examples for each case, and finally create Response and TlvResponse classes that will be reused in future tutorials. Requirements It is assumed that the hardware and software requirements from the connecting BLE tutorial are present and configured correctly. It is suggested that you have first completed the connect and sending commands tutorials before going through this tutorial. Just Show me the Demo(s)!! python kotlin Each of the scripts for this tutorial can be found in the Tutorial 3 directory. Python >= 3.9 and < 3.12 must be used as specified in the requirements Parsing a One Packet TLV Response You can test parsing a one packet TLV response with your camera through BLE using the following script: $ python ble_command_get_version.py See the help for parameter definitions: $ python ble_command_get_version.py --help usage: ble_command_get_version.py [-h] [-i IDENTIFIER] Connect to a GoPro camera via BLE, then get the Open GoPro version. optional arguments: -h, --help show this help message and exit -i IDENTIFIER, --identifier IDENTIFIER Last 4 digits of GoPro serial number, which is the last 4 digits of the default camera SSID. If not used, first discovered GoPro will be connected to Parsing Multiple Packet TLV Responses You can test parsing multiple packet TLV responses with your camera through BLE using the following script: $ python ble_command_get_hardware_info.py See the help for parameter definitions: $ python ble_command_get_hardware_info.py --help usage: ble_command_get_hardware_info.py [-h] [-i IDENTIFIER] Connect to a GoPro camera via BLE, then get its hardware info. options: -h, --help show this help message and exit -i IDENTIFIER, --identifier IDENTIFIER Last 4 digits of GoPro serial number, which is the last 4 digits of the default camera SSID. If not used, first discovered GoPro will be connected to The Kotlin file for this tutorial can be found on Github. To perform the tutorial, run the Android Studio project, select “Tutorial 3” from the dropdown and click on “Perform.” This requires that a GoPro is already connected via BLE, i.e. that Tutorial 1 was already run. You can check the BLE status at the top of the app. Perform Tutorial 3 This will start the tutorial and log to the screen as it executes. When the tutorial is complete, click “Exit Tutorial” to return to the Tutorial selection screen. Setup We must first connect as was discussed in the connecting BLE tutorial. When enabling notifications, one of the notification handlers described in the following sections will be used. Response Overview In the preceding tutorials, we have been using a very simple response handling procedure where the notification handler simply checks that the UUID is the expected UUID and that the status byte of the response is 0 (Success). 
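For reference, a minimal sketch of that simple handling procedure, assuming the bleak client and asyncio synchronization event used in the earlier tutorials, might look like the following; the handler name and UUID constant are illustrative:

```python
# Sketch of the "simple" handling used so far: check the UUID and the status
# byte, then unblock the writer. Names here are illustrative, not tutorial code.
import asyncio
from bleak.backends.characteristic import BleakGATTCharacteristic

COMMAND_RSP_UUID = "b5f90073-aa8d-11e3-9046-0002a5d5c51b"  # Command Response
event = asyncio.Event()


async def simple_notification_handler(characteristic: BleakGATTCharacteristic, data: bytearray) -> None:
    # Only consider the characteristic we expect the response on
    if characteristic.uuid.lower() != COMMAND_RSP_UUID:
        return
    # Third byte is the status; 0x00 means success
    if data[2] == 0x00:
        print("Command sent successfully")
    else:
        print(f"Command failed with status {data[2]:#04x}")
    # Wake the writer that is pending on the event
    event.set()
```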
This has been fine since we were only performing specific operations where this works and we know that the sequence always appears as such (connection sequence left out for brevity): [Sequence diagram: devices are connected as in Tutorial 1; the user device writes to a characteristic and receives a single Notification Response (MSB == 0 (start))] In actuality, responses can be more complicated. As described in the BLE Spec, responses can be comprised of multiple packets where each packet is <= 20 bytes such as: [Sequence diagram: devices are connected as in Tutorial 1; the user device writes to a characteristic and receives a Notification Response (MSB == 0 (start)) followed by multiple Notification Responses (MSB == 1 (continuation))] This requires the implementation of accumulating and parsing algorithms which will be described below. Parsing a One Packet TLV Response This section will describe how to parse one packet (<= 20 byte) responses. A one-packet response is formatted as such: Header (length) Operation ID Status Response 1 byte 1 byte 1 byte Length - 2 bytes Responses with Payload Length 0 These are the only responses that we have seen thus far through the first 2 tutorials. They return a status but have a 0 length additional response. For example, consider Set Shutter. It returned a response of: 02:01:00 This equates to: Header (length) Command ID Status Response 1 byte 1 byte 1 byte Length - 2 bytes 0x02 0x01 == Set Shutter 0x00 == Success (2 - 2 = 0 bytes) We can see how this response includes the status but no additional response data. This type of response will be used for most Commands and Setting Responses as seen in the previous tutorial. Responses with Payload However, there are some operations that do return additional response data. These are identified by the presence of parameters in their Response documentation as shown in the red box here: Response With Payload In this tutorial, we will walk through creating a simple parser to parse the Open GoPro Get Version Command which is an example of such an operation. It is important to always query the version after connecting in order to know which API is supported. See the relevant version of the BLE and / or WiFi spec for more details about each version. First, we send the Get Version Command to the Command Request UUID in the same manner as commands were sent in the previous tutorial: python kotlin request_uuid = GoProUuid.COMMAND_REQ_UUID request = bytes([0x01, 0x51]) await client.write_gatt_char(request_uuid.value, request, response=True) await event.wait() Wait to receive the notification response We receive a response at the expected handle (as a TLV Response). This is logged as: Getting the Open GoPro version... Writing to GoProUuid.COMMAND_REQ_UUID: 01:51 Received response at GoProUuid.COMMAND_RSP_UUID: 06:51:00:01:02:01:00 val versionRequest = ubyteArrayOf(0x01U, 0x51U) ble.writeCharacteristic(goproAddress, GoProUUID.CQ_COMMAND.uuid, versionRequest) var tlvResponse = receivedResponses.receive() as Response.Tlv We then receive a response at the expected handle. This is logged as such: Getting the Open GoPro version Writing characteristic b5f90072-aa8d-11e3-9046-0002a5d5c51b ==> 01:51 Wrote characteristic b5f90072-aa8d-11e3-9046-0002a5d5c51b Characteristic b5f90073-aa8d-11e3-9046-0002a5d5c51b changed | value: 06:51:00:01:02:01:00 Received response on CQ_COMMAND_RSP Received packet of length 6. 
0 bytes remaining This response equates to: Header (length) Command ID Status Response 1 byte 1 byte 1 bytes Length - 2 bytes 0x06 0x51 == Get Version 0x00 == Success 0x01 0x02 0x01 0x00 We can see that this response payload contains 4 additional bytes that need to be parsed. Using the information from the Get Version Documentation, we know to parse this as: Byte Meaning 0x01 Length of Major Version Number 0x02 Major Version Number of length 1 byte 0x01 Length of Minor Version Number 0x00 Minor Version Number of length 1 byte We implement this as follows. First, we parse the length, command ID, and status from the first 3 bytes of the response. The remainder is stored as the payload. This is all of the common parsing across TLV Responses. Each individual response will document how to further parse the payload. python kotlin The snippets of code included in this section are taken from the notification handler First byte is the length of this response. length = data[0] Second byte is the ID command_id = data[1] Third byte is the status status = data[2] The remainder is the payload payload = data[3 : length + 1] The snippets of code included in this section are taken from the Response.Tlv.Parse method // Parse header bytes tlvResponse.parse() ... open fun parse() { require(isReceived) id = rawBytes[0].toInt() status = rawBytes[1].toInt() // Store remainder as payload payload = rawBytes.drop(2).toUByteArray() } From the response definition, we know these parameters are one byte each and equate to the major and the minor version so let’s print them (and all of the other response information) as such: python kotlin major_length = payload[0] payload.pop(0) major = payload[:major_length] payload.pop(major_length) minor_length = payload[0] payload.pop(0) minor = payload[:minor_length] logger.info(f\"The version is Open GoPro {major[0]}.{minor[0]}\") logger.info(f\"Received a response to {command_id=} with {status=}: version={major[0]}.{minor[0]}\") which shows on the log as: Received a response to command_id=81 with status=0, payload=01:02:01:00 The version is Open GoPro 2.0 The snippets of code included in this section are taken from the OpenGoProVersion from_bytes method. This class is a simple data class to contain the Get Version information. var buf = data.toUByteArray() val minorLen = buf[0].toInt() buf = buf.drop(1).toUByteArray() val minor = buf.take(minorLen).toInt() val majorLen = buf[0].toInt() buf = buf.drop(1).toUByteArray() val major = buf.take(majorLen).toInt() return OpenGoProVersion(minor, major) which shows on the log as such: Received response: ID: 81, Status: 0, Payload: 01:02:01:00 Got the Open GoPro version successfully: 2.0 Quiz time! 📚 ✏️ What is the maximum size of an individual notification response packet at the Open GoPro application layer? A: 20 bytes B: 256 bytes C: There is no maximum size Submit Answer Correct!! 😃 Incorrect!! 😭 The correct answer is A. Responses can be composed of multiple packets where each packet is at maximum 20 bytes. What is the maximum amount of bytes that one response can be composed of? A: 20 bytes B: 256 bytes C: There is no maximum size Submit Answer Correct!! 😃 Incorrect!! 😭 The correct answer is C. There is no limit on the amount of packets that can comprise a response. How many packets are command responses composed of? A: Always 1 packet B: Always multiple packets. C: A variable amount of packets depending on the payload size Submit Answer Correct!! 😃 Incorrect!! 😭 The correct answer is C. 
Command responses are sometimes 1 packet (just returning the status). Other times, command responses also contain a payload and can thus be multiple packets if the payload is big enough (i.e. in the case of Get Hardware Info). This is described in the per-command documentation in the BLE spec. How many packets are setting responses comprised of? A: Always 1 packet B: Always multiple packets. C: A variable amount of packets depending on the payload size Submit Answer Correct!! 😃 Incorrect!! 😭 The correct answer is A. Settings Responses only ever contain the response status. Parsing Multiple Packet TLV Responses This section will describe parsing TLV responses that contain more than one packet. It will first describe how to accumulate such responses and then provide a parsing example. We will be creating small Response and TlvResponse classes that will be re-used for future tutorials. Accumulating the Response The first step is to accumulate the multiple packets into one response. Whereas for all tutorials until now, we have just used the header bytes of the response as the length, we now must completely parse the headers as they are defined, reproduced for reference here: Byte 1 Byte 2 (optional) Byte 3 (optional) 7 6 5 4 3 2 1 0 7 6 5 4 3 2 1 0 7 6 5 4 3 2 1 0 0: Start 00: General Message Length: 5 bits 0: Start 01: Extended (13-bit) Message Length: 13 bits 0: Start 10: Extended (16-bit) Message Length: 16 bits 0: Start 11: Reserved 1: Continuation The basic accumulation algorithm (which is implemented in the Response.Accumulate method) is as follows: Is the continuation bit set? python kotlin The example script that will be walked through for this section is ble_command_get_hardware_info.py. if buf[0] & CONT_MASK: buf.pop(0) else: ... if (data.first().and(Mask.Continuation.value) == Mask.Continuation.value) { buf = buf.drop(1).toUByteArray() // Pop the header byte } else { // This is a new packet ... No, the continuation bit was not set. Therefore create new response, then get its length. python kotlin This is a new packet so start with an empty byte array self.bytes = bytearray() hdr = Header((buf[0] & HDR_MASK) >> 5) if hdr is Header.GENERAL: self.bytes_remaining = buf[0] & GEN_LEN_MASK buf = buf[1:] elif hdr is Header.EXT_13: self.bytes_remaining = ((buf[0] & EXT_13_BYTE0_MASK) << 8) + buf[1] buf = buf[2:] elif hdr is Header.EXT_16: self.bytes_remaining = (buf[1] << 8) + buf[2] buf = buf[3:] // This is a new packet so start with empty array packet = ubyteArrayOf() when (Header.fromValue((buf.first() and Mask.Header.value).toInt() shr 5)) { Header.GENERAL -> { bytesRemaining = buf[0].and(Mask.GenLength.value).toInt() buf = buf.drop(1).toUByteArray() } Header.EXT_13 -> { bytesRemaining = ((buf[0].and(Mask.Ext13Byte0.value) .toLong() shl 8) or buf[1].toLong()).toInt() buf = buf.drop(2).toUByteArray() } Header.EXT_16 -> { bytesRemaining = ((buf[1].toLong() shl 8) or buf[2].toLong()).toInt() buf = buf.drop(3).toUByteArray() } Header.RESERVED -> { throw Exception(\"Unexpected RESERVED header\") } } Append current packet to response and decrement bytes remaining. python kotlin Append payload to buffer and update remaining / complete self.bytes.extend(buf) self.bytes_remaining -= len(buf) // Accumulate the payload now that headers are handled and dropped packet += buf bytesRemaining -= buf.size In the notification handler, we are then enqueueing the received response if there are no bytes remaining. python kotlin if response.is_received: ... 
await received_responses.put(response) and finally parsing the payload back in the main task after it receives the accumulated response from the queue which, at the current TLV Response level, is just extracting the ID, status, and payload: class TlvResponse(Response): def parse(self) -> None: self.id = self.raw_bytes[0] self.status = self.raw_bytes[1] self.payload = self.raw_bytes[2:] ... response = await received_responses.get() response.parse() if (response.isReceived) { if (uuid == GoProUUID.CQ_COMMAND_RSP) { CoroutineScope(Dispatchers.IO).launch { receivedResponses.send(response) } } ... NoYesDecrement bytes remainingYesNoRead Available PacketContinuation bit set?Create new empty responseGet bytes remaining, i.e. lengthAppend packet to accumulating responseBytes remaining == 0?Parse Received Packet We can see this in action when we send the Get Hardware Info Command: python kotlin request_uuid = GoProUuid.COMMAND_REQ_UUID request = bytearray([0x01, 0x3C]) await client.write_gatt_char(request_uuid.value, request, response=True) response = await received_responses.get() val hardwareInfoRequest = ubyteArrayOf(0x01U, 0x3CU) ble.writeCharacteristic(goproAddress, GoProUUID.CQ_COMMAND.uuid, hardwareInfoRequest) Then, in the notification handler, we continuously receive and accumulate packets (per UUID) until we have received an entire response, at which point we perform common TLV parsing (via the TlvResponse’s parse method) to extract Command ID, Status, and payload. Then we enqueue the received response to notify the writer that the response is ready. Finally we reset the per-UUID response to prepare it to receive a new response. This notification handler is only designed to handle TlvResponses. This is fine for this tutorial since that is all we will be receiving. python kotlin request_uuid = GoProUuid.COMMAND_REQ_UUID response_uuid = GoProUuid.COMMAND_RSP_UUID responses_by_uuid = GoProUuid.dict_by_uuid(TlvResponse) received_responses: asyncio.Queue[TlvResponse] = asyncio.Queue() async def tlv_notification_handler(characteristic: BleakGATTCharacteristic, data: bytearray) -> None: uuid = GoProUuid(client.services.characteristics[characteristic.handle].uuid) response = responses_by_uuid[uuid] response.accumulate(data) if response.is_received: If this is the correct handle, enqueue it for processing if uuid is response_uuid: logger.info(\"Received the get hardware info response\") await received_responses.put(response) Anything else is unexpected. This shouldn't happen else: logger.error(\"Unexpected response\") Reset the per-UUID response responses_by_uuid[uuid] = TlvResponse(uuid) private fun notificationHandler(characteristic: UUID, data: UByteArray) { ... responsesByUuid[uuid]?.let { response -> response.accumulate(data) if (response.isReceived) { if (uuid == GoProUUID.CQ_COMMAND_RSP) { CoroutineScope(Dispatchers.IO).launch { receivedResponses.send(response) } } ... responsesByUuid[uuid] = Response.muxByUuid(uuid) } } } We can see the individual packets being accumulated in the log: python kotlin Getting the camera's hardware info... 
Writing to GoProUuid.COMMAND_REQ_UUID: 01:3c Received response at handle 47: 20:62:3c:00:04:00:00:00:3e:0c:48:45:52:4f:31:32:20:42:6c:61 self.bytes_remaining=80 Received response at handle 47: 80:63:6b:04:30:78:30:35:0f:48:32:33:2e:30:31:2e:30:31:2e:39 self.bytes_remaining=61 Received response at handle 47: 81:39:2e:35:36:0e:43:33:35:30:31:33:32:34:35:30:30:37:30:32 self.bytes_remaining=42 Received response at handle 47: 82:11:48:45:52:4f:31:32:20:42:6c:61:63:6b:64:65:62:75:67:0c self.bytes_remaining=23 Received response at handle 47: 83:32:36:37:34:66:37:66:36:36:31:30:34:01:00:01:01:01:00:02 self.bytes_remaining=4 Received response at handle 47: 84:5b:5d:01:01 self.bytes_remaining=0 Received the get hardware info response Getting the Hardware Info Writing characteristic b5f90072-aa8d-11e3-9046-0002a5d5c51b ==> 01:3C Characteristic b5f90073-aa8d-11e3-9046-0002a5d5c51b changed | value: 20:5B:3C:00:04:00:00:00:3E:0C:48:45:52:4F:31:32:20:42:6C:61 Received response on CQ_COMMAND_RSP Received packet of length 18. 73 bytes remaining Characteristic b5f90073-aa8d-11e3-9046-0002a5d5c51b changed | value: 80:63:6B:04:30:78:30:35:0F:48:32:33:2E:30:31:2E:30:31:2E:39 Received response on CQ_COMMAND_RSP Received packet of length 19. 54 bytes remaining Wrote characteristic b5f90072-aa8d-11e3-9046-0002a5d5c51b Characteristic b5f90073-aa8d-11e3-9046-0002a5d5c51b changed | value: 81:39:2E:35:36:0E:43:33:35:30:31:33:32:34:35:30:30:37:30:32 Received response on CQ_COMMAND_RSP Received packet of length 19. 35 bytes remaining Characteristic b5f90073-aa8d-11e3-9046-0002a5d5c51b changed | value: 82:0A:47:50:32:34:35:30:30:37:30:32:0C:32:36:37:34:66:37:66 Received response on CQ_COMMAND_RSP Received packet of length 19. 16 bytes remaining Characteristic b5f90073-aa8d-11e3-9046-0002a5d5c51b changed | value: 83:36:36:31:30:34:01:00:01:01:01:00:02:5B:5D:01:01 Received response on CQ_COMMAND_RSP Received packet of length 16. 0 bytes remaining At this point the response has been accumulated. 
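To make the header handling described above concrete, here is a standalone sketch of decoding a single packet's header; the function and mask names are illustrative and mirror, rather than reproduce, the tutorial's Response accumulation logic:

```python
# Standalone sketch of decoding one notification packet's header, following
# the accumulation algorithm described above. Names are illustrative.
from typing import Optional, Tuple

CONT_MASK = 0b10000000          # Continuation bit
HDR_MASK = 0b01100000           # Header type bits
GEN_LEN_MASK = 0b00011111       # 5-bit general message length
EXT_13_BYTE0_MASK = 0b00011111  # Upper 5 bits of a 13-bit length


def decode_packet(packet: bytes) -> Tuple[bool, Optional[int], bytes]:
    """Return (is_continuation, total_message_length_or_None, payload)."""
    if packet[0] & CONT_MASK:
        # Continuation packet: one header byte, no length field
        return True, None, packet[1:]
    header = (packet[0] & HDR_MASK) >> 5
    if header == 0b00:  # General: 5-bit length
        return False, packet[0] & GEN_LEN_MASK, packet[1:]
    if header == 0b01:  # Extended 13-bit length
        return False, ((packet[0] & EXT_13_BYTE0_MASK) << 8) | packet[1], packet[2:]
    if header == 0b10:  # Extended 16-bit length
        return False, (packet[1] << 8) | packet[2], packet[3:]
    raise ValueError("Reserved header type")


# The first Get Hardware Info packet in the log above starts 20:62:..., i.e. an
# extended 13-bit header announcing a 0x062 (98) byte message.
is_cont, total_len, payload = decode_packet(bytes([0x20, 0x62, 0x3C, 0x00]))
assert (is_cont, total_len) == (False, 0x62)
```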
We then parse and log the payload using the Get Hardware Info response documentation: python kotlin hardware_info = HardwareInfo.from_bytes(response.payload) logger.info(f\"Received hardware info: {hardware_info}\") where the parsing is done as such: @classmethod def from_bytes(cls, data: bytes) -> HardwareInfo: buf = bytearray(data) Get model number model_num_length = buf.pop(0) model = int.from_bytes(buf[:model_num_length]) buf = buf[model_num_length:] Get model name model_name_length = buf.pop(0) model_name = (buf[:model_name_length]).decode() buf = buf[model_name_length:] Advance past deprecated bytes deprecated_length = buf.pop(0) buf = buf[deprecated_length:] Get firmware version firmware_length = buf.pop(0) firmware = (buf[:firmware_length]).decode() buf = buf[firmware_length:] Get serial number serial_length = buf.pop(0) serial = (buf[:serial_length]).decode() buf = buf[serial_length:] Get AP SSID ssid_length = buf.pop(0) ssid = (buf[:ssid_length]).decode() buf = buf[ssid_length:] Get MAC address mac_length = buf.pop(0) mac = (buf[:mac_length]).decode() buf = buf[mac_length:] return cls(model, model_name, firmware, serial, ssid, mac) This logs as: Parsed hardware info: { \"model_name\": \"HERO12 Black\", \"firmware_version\": \"H23.01.01.99.56\", \"serial_number\": \"C3501324500702\", \"ap_ssid\": \"HERO12 Blackdebug\", \"ap_mac_address\": \"2674f7f66104\" } tlvResponse.parse() val hardwareInfo = HardwareInfo.fromBytes(tlvResponse.payload) where the parsing is done as such: fun fromBytes(data: UByteArray): HardwareInfo { // Parse header bytes var buf = data.toUByteArray() // Get model number val modelNumLength = buf.first().toInt() buf = buf.drop(1).toUByteArray() val model = buf.take(modelNumLength).toInt() buf = buf.drop(modelNumLength).toUByteArray() // Get model name val modelNameLength = buf.first().toInt() buf = buf.drop(1).toUByteArray() val modelName = buf.take(modelNameLength).decodeToString() buf = buf.drop(modelNameLength).toUByteArray() // Advance past deprecated bytes val deprecatedLength = buf.first().toInt() buf = buf.drop(1).toUByteArray() buf = buf.drop(deprecatedLength).toUByteArray() // Get firmware version val firmwareLength = buf.first().toInt() buf = buf.drop(1).toUByteArray() val firmware = buf.take(firmwareLength).decodeToString() buf = buf.drop(firmwareLength).toUByteArray() // Get serial number val serialLength = buf.first().toInt() buf = buf.drop(1).toUByteArray() val serial = buf.take(serialLength).decodeToString() buf = buf.drop(serialLength).toUByteArray() // Get AP SSID val ssidLength = buf.first().toInt() buf = buf.drop(1).toUByteArray() val ssid = buf.take(ssidLength).decodeToString() buf = buf.drop(ssidLength).toUByteArray() // Get MAC Address val macLength = buf.first().toInt() buf = buf.drop(1).toUByteArray() val mac = buf.take(macLength).decodeToString() return HardwareInfo(model, modelName, firmware, serial, ssid, mac) } This logs as: Got the Hardware Info successfully: HardwareInfo( modelNumber=1040187392, modelName=HERO12 Black, firmwareVersion=H23.01.01.99.56, serialNumber=C3501324500702, apSsid=GP24500702, apMacAddress=2674f7f66104 ) Quiz time! 📚 ✏️ How can we know that a response has been completely received? A: The stop bit will be set in the header B: The response has accumulated length bytes C: By checking for the end of frame (EOF) sentinel character Submit Answer Correct!! 😃 Incorrect!! 😭 The correct answer is B. The length of the entire response is parsed from the first packet. 
We then accumulate packets, keeping track of the received length, until all of the bytes have been received. A and C are just made up 😜. Troubleshooting See the first tutorial’s troubleshooting section. Good Job! Congratulations 🤙 You now know how to accumulate TLV responses that are received from the GoPro, at least if they are received uninterrupted. There is additional logic required for a complete solution such as checking the UUID the response is received on and storing a dict of response per UUID. At the current time, this endeavor is left for the reader. For a complete example of this, see the Open GoPro Python SDK. To learn about a different type of operation (Queries), go to the next tutorial.",
+ "categories": [],
+ "tags": [],
+ "url": "/OpenGoPro/tutorials/parse-ble-responses#"
+ },
+ {
+ "title": "Tutorial 4: BLE TLV Queries: ",
+ "excerpt": "This document will provide a walk-through tutorial to use the Open GoPro Interface to query the camera’s setting and status information via BLE. Queries in this sense are operations that are initiated by writing to the Query UUID and receiving responses via the Query Response UUID. A list of queries can be found in the Query ID Table. It is important to distinguish between queries and commands because they each have different request and response packet formats. This tutorial only considers sending these queries as one-off queries. That is, it does not consider state management / synchronization when sending multiple queries. This will be discussed in a future lab. Requirements It is assumed that the hardware and software requirements from the connecting BLE tutorial are present and configured correctly. It is suggested that you have first completed the connect, sending commands, and parsing responses tutorials before going through this tutorial. Just Show me the Demo(s)!! python kotlin Each of the scripts for this tutorial can be found in the Tutorial 4 directory. Python >= 3.9 and < 3.12 must be used as specified in the requirements Individual Query Poll You can test an individual query poll with your camera through BLE using the following script: $ python ble_query_poll_resolution_value.py See the help for parameter definitions: $ python ble_query_poll_resolution_value.py --help usage: ble_query_poll_resolution_value.py [-h] [-i IDENTIFIER] Connect to a GoPro camera, get the current resolution, modify the resolution, and confirm the change was successful. optional arguments: -h, --help show this help message and exit -i IDENTIFIER, --identifier IDENTIFIER Last 4 digits of GoPro serial number, which is the last 4 digits of the default camera SSID. If not used, first discovered GoPro will be connected to Multiple Simultaneous Query Polls You can test querying multiple queries simultaneously with your camera through BLE using the following script: $ python ble_query_poll_multiple_setting_values.py See the help for parameter definitions: $ python ble_query_poll_multiple_setting_values.py --help usage: ble_query_poll_multiple_setting_values.py [-h] [-i IDENTIFIER] Connect to a GoPro camera then get the current resolution, fps, and fov. optional arguments: -h, --help show this help message and exit -i IDENTIFIER, --identifier IDENTIFIER Last 4 digits of GoPro serial number, which is the last 4 digits of the default camera SSID. If not used, first discovered GoPro will be connected to Registering for Query Push Notifications You can test registering for querties and receiving push notifications with your camera through BLE using the following script: $ python ble_query_register_resolution_value_updates.py See the help for parameter definitions: $ python ble_query_register_resolution_value_updates.py --help usage: ble_query_register_resolution_value_updates.py [-h] [-i IDENTIFIER] Connect to a GoPro camera, register for updates to the resolution, receive the current resolution, modify the resolution, and confirm receipt of the change notification. optional arguments: -h, --help show this help message and exit -i IDENTIFIER, --identifier IDENTIFIER Last 4 digits of GoPro serial number, which is the last 4 digits of the default camera SSID. If not used, first discovered GoPro will be connected to The Kotlin file for this tutorial can be found on Github. 
To perform the tutorial, run the Android Studio project, select “Tutorial 4” from the dropdown and click on “Perform.” This requires that a GoPro is already connected via BLE, i.e. that Tutorial 1 was already run. You can check the BLE status at the top of the app. Perform Tutorial 4 This will start the tutorial and log to the screen as it executes. When the tutorial is complete, click “Exit Tutorial” to return to the Tutorial selection screen. Setup We must first connect as was discussed in the connecting BLE tutorial. python kotlin We have slightly updated the notification handler from the previous tutorial to handle a QueryResponse instead of a TlvResponse where QueryResponse is a subclass of TlvResponse that will be created in this tutorial. responses_by_uuid = GoProUuid.dict_by_uuid(QueryResponse) received_responses: asyncio.Queue[QueryResponse] = asyncio.Queue() query_request_uuid = GoProUuid.QUERY_REQ_UUID query_response_uuid = GoProUuid.QUERY_RSP_UUID setting_request_uuid = GoProUuid.SETTINGS_REQ_UUID setting_response_uuid = GoProUuid.SETTINGS_RSP_UUID async def notification_handler(characteristic: BleakGATTCharacteristic, data: bytearray) -> None: uuid = GoProUuid(client.services.characteristics[characteristic.handle].uuid) response = responses_by_uuid[uuid] response.accumulate(data) Notify the writer if we have received the entire response if response.is_received: If this is query response, it must contain a resolution value if uuid is query_response_uuid: logger.info(\"Received a Query response\") await received_responses.put(response) If this is a setting response, it will just show the status elif uuid is setting_response_uuid: logger.info(\"Received Set Setting command response.\") await received_responses.put(response) Anything else is unexpected. This shouldn't happen else: logger.error(\"Unexpected response\") Reset per-uuid Response responses_by_uuid[uuid] = QueryResponse(uuid) The code above is taken from ble_query_poll_resolution_value.py We are defining a resolution enum that will be updated as we receive new resolutions: private enum class Resolution(val value: UByte) { RES_4K(1U), RES_2_7K(4U), RES_2_7K_4_3(6U), RES_1080(9U), RES_4K_4_3(18U), RES_5K(24U); companion object { private val valueMap: Map<UByte, Resolution> by lazy { values().associateBy { it.value } } fun fromValue(value: UByte) = valueMap.getValue(value) } } private lateinit var resolution: Resolution There are two methods to query status / setting information, each of which will be described in a following section: Polling Query Information Registering for query push notifications Parsing a Query Response Before sending queries, we must first describe how Query response parsing differs from the Command response parsing that was introduced in the previous tutorial. To recap, the generic response format for both Commands and Queries is: Header (length) Operation ID (Command / Query ID) Status Response 1-2 bytes 1 byte 1 bytes Length - 2 bytes Query Responses contain an array of additional TLV groups in the Response field as such: ID1 Length1 Value1 ID2 Length2 Value 2 … IDN LengthN ValueN 1 byte 1 byte Length1 bytes 1 byte 1 byte Length2 bytes … 1 byte 1 byte LengthN bytes We will be extending the TlvResponse class that was defined in the parsing responses tutorial to perform common parsing shared among all queries into a QueryResponse class as seen below: We have already parsed the length, Operation ID, and status, and extracted the payload in the TlvResponse class. The next step is to parse the payload. 
Therefore, we now continuously parse Type (ID) - Length - Value groups until we have consumed the response. We are storing each value in a hash map indexed by ID for later access. python kotlin class QueryResponse(TlvResponse): ... def parse(self) -> None: super().parse() buf = bytearray(self.payload) while len(buf) > 0: Get ID and Length of query parameter param_id = buf[0] param_len = buf[1] buf = buf[2:] Get the value value = buf[:param_len] Store in dict for later access self.data[param_id] = bytes(value) Advance the buffer buf = buf[param_len:] while (buf.isNotEmpty()) { // Get each parameter's ID and length val paramId = buf[0] val paramLen = buf[1].toInt() buf = buf.drop(2) // Get the parameter's value val paramVal = buf.take(paramLen) // Store in data dict for access later data[paramId] = paramVal.toUByteArray() // Advance the buffer for continued parsing buf = buf.drop(paramLen) } yesnoParse Query IDParse StatusMore data?Get Value IDGet Value LengthGet Valuedone How many packets are query responses? A: Always 1 packet B: Always multiple packets C: Can be 1 or multiple packets Submit Answer Correct!! 😃 Incorrect!! 😭 The correct answer is C. Query responses can be one packet (if for example querying a specific setting) or multiple packets (when querying many or all settings as in the example here). Which field is not common to all TLV responses? A: length B: status C: ID D: None of the Above Submit Answer Correct!! 😃 Incorrect!! 😭 The correct answer is D. All Commands and Query responses have a length, ID, and status. Polling Query Information It is possible to poll one or more setting / status values using the following queries: Query ID Request Query 0x12 [Get Setting value(s)](/OpenGoPro/ble/features/query.htmlget-setting-values) len:12:xx:xx 0x13 [Get Status value(s)](/OpenGoPro/ble/features/query.htmlget-status-values) len:13:xx:xx where xx are setting / status ID(s) and len is the length of the rest of the query (the number of query bytes plus one for the request ID byte). There will be specific examples below. Since they are two separate queries, combination of settings / statuses can not be polled simultaneously. Here is a generic sequence diagram (the same is true for statuses): GoProOpen GoPro user deviceGoProOpen GoPro user deviceConnected (steps from connect tutorial)Get Setting value(s) queries written to Query UUIDSetting values responded to Query Response UUIDMore setting values responded to Query Response UUID...More setting values responded to Query Response UUID The number of notification responses will vary depending on the amount of settings that have been queried. Note that setting values will be combined into one notification until it reaches the maximum notification size (20 bytes). At this point, a new response will be sent. Therefore, it is necessary to accumulate and then parse these responses as was described in parsing query responses Individual Query Poll Here we will walk through an example of polling one setting (Resolution). First we send the query: python kotlin The sample code can be found in in ble_query_poll_resolution_value.py. 
query_request_uuid = GoProUuid.QUERY_REQ_UUID request = bytes([0x02, 0x12, RESOLUTION_ID]) await client.write_gatt_char(query_request_uuid.value, request, response=True) val pollResolution = ubyteArrayOf(0x02U, 0x12U, RESOLUTION_ID) ble.writeCharacteristic(goproAddress, GoProUUID.CQ_QUERY.uuid, pollResolution) Then when the response is received from the notification handler we parse it into individual query elements in the QueryResponse class and extract the new resolution value. python kotlin Wait to receive the notification response response = await received_responses.get() response.parse() resolution = Resolution(response.data[RESOLUTION_ID][0]) which logs as such: Getting the current resolution Writing to GoProUuid.QUERY_REQ_UUID: 02:12:02 Received response at handle=62: b'05:12:00:02:01:09' eceived the Resolution Query response Resolution is currently Resolution.RES_1080 // Wait to receive the response and then convert it to resolution val queryResponse = (receivedResponses.receive() as Response.Query).apply { parse() } resolution = Resolution.fromValue(queryResponse.data.getValue(RESOLUTION_ID).first()) which logs as such: Polling the current resolution Writing characteristic b5f90076-aa8d-11e3-9046-0002a5d5c51b ==> 02:12:02 Wrote characteristic b5f90076-aa8d-11e3-9046-0002a5d5c51b Characteristic b5f90077-aa8d-11e3-9046-0002a5d5c51b changed | value: 05:12:00:02:01:09 Received response on CQ_QUERY_RSP Received packet of length 5. 0 bytes remaining Received Query Response Camera resolution is RES_1080 For verification purposes, we are then changing the resolution and polling again to verify that the setting has changed: python kotlin while resolution is not target_resolution: request = bytes([0x02, 0x12, RESOLUTION_ID]) await client.write_gatt_char(query_request_uuid.value, request, response=True) response = await received_responses.get() Wait to receive the notification response response.parse() resolution = Resolution(response.data[RESOLUTION_ID][0]) which logs as such: Changing the resolution to Resolution.RES_2_7K... Writing to GoProUuid.SETTINGS_REQ_UUID: 03:02:01:04 Writing to GoProUuid.SETTINGS_REQ_UUID: 03:02:01:04 Received response at GoProUuid.SETTINGS_RSP_UUID: 02:02:00 Received Set Setting command response. Polling the resolution to see if it has changed... Writing to GoProUuid.QUERY_REQ_UUID: 02:12:02 Received response at GoProUuid.QUERY_RSP_UUID: 05:12:00:02:01:04 Received the Resolution Query response Resolution is currently Resolution.RES_2_7K Resolution has changed as expected. Exiting... while (resolution != newResolution) { ble.writeCharacteristic(goproAddress, GoProUUID.CQ_QUERY.uuid, pollResolution) val queryNotification = (receivedResponses.receive() as Response.Query).apply { parse() } resolution = Resolution.fromValue(queryNotification.data.getValue(RESOLUTION_ID).first()) } which logs as such: Changing the resolution to RES_2_7K Writing characteristic b5f90074-aa8d-11e3-9046-0002a5d5c51b ==> 03:02:01:04 Wrote characteristic b5f90074-aa8d-11e3-9046-0002a5d5c51b Characteristic b5f90075-aa8d-11e3-9046-0002a5d5c51b changed | value: 02:02:00 Received response on CQ_SETTING_RSP Received packet of length 2. 0 bytes remaining Received set setting response. Resolution successfully changed Polling the resolution until it changes Writing characteristic b5f90076-aa8d-11e3-9046-0002a5d5c51b ==> 02:12:02 Characteristic b5f90077-aa8d-11e3-9046-0002a5d5c51b changed | value: 05:12:00:02:01:04 Received response on CQ_QUERY_RSP Received packet of length 5. 
0 bytes remaining Received Query Response Wrote characteristic b5f90076-aa8d-11e3-9046-0002a5d5c51b Camera resolution is currently RES_2_7K Multiple Simultaneous Query Polls Rather than just polling one setting, it is also possible to poll multiple settings. An example of this is shown below. It is very similar to the previous example except that the query now includes 3 settings: Resolution, FPS, and FOV. python kotlin RESOLUTION_ID = 2 FPS_ID = 3 FOV_ID = 121 request = bytes([0x04, 0x12, RESOLUTION_ID, FPS_ID, FOV_ID]) await client.write_gatt_char(query_request_uuid.value, request, response=True) response = await received_responses.get() Wait to receive the notification response TODO The length (first byte of the query) has been increased to 4 to accommodate the extra settings. We are also parsing the response to get all 3 values: python kotlin response.parse() logger.info(f\"Resolution is currently {Resolution(response.data[RESOLUTION_ID][0])}\") logger.info(f\"Video FOV is currently {VideoFOV(response.data[FOV_ID][0])}\") logger.info(f\"FPS is currently {FPS(response.data[FPS_ID][0])}\") TODO When we are storing the updated setting, we are just taking the first byte (i.e. index 0). A real-world implementation would need to know the length (and type) of the setting / status response by the ID. For example, sometimes settings / statuses are bytes, words, strings, etc. They are then printed to the log which will look like the following: python kotlin Getting the current resolution, fps, and fov. Writing to GoProUuid.QUERY_REQ_UUID: 04:12:02:03:79 Received response at GoProUuid.QUERY_RSP_UUID: 0b:12:00:02:01:09:03:01:00:79:01:00 Received the Query Response Resolution is currently Resolution.RES_1080 Video FOV is currently VideoFOV.FOV_WIDE FPS is currently FPS.FPS_240 TODO In general, we can parse query values by looking at relevant documentation linked from the Setting or Status ID tables. For example (for settings): ID 2 == 9 equates to Resolution == 1080 ID 3 == 1 equates to FPS == 120 Query All It is also possible to query all settings / statuses by not passing any ID’s into the query, i.e.: Query ID Request Query 0x12 Get All Settings 01:12 0x13 Get All Statuses 01:13 Quiz time! 📚 ✏️ How can we poll the encoding status and the resolution setting using one query? A: Concatenate a ‘Get Setting Value’ query and a ‘Get Status’ query with the relevant ID’s B: Concatenate the ‘Get All Setting’ and ‘Get All Status’ queries. C: It is not possible Submit Answer Correct!! 😃 Incorrect!! 😭 The correct answer is C. It is not possible to concatenate queries. This would result in an unknown sequence of bytes from the camera’s perspective. So it is not possible to get a setting value and a status value in one query. The Get Setting Query (with resolution ID) and Get Status Query (with encoding ID) must be sent sequentially in order to get this information. Registering for Query Push Notifications Rather than polling the query information, it is also possible to use an interrupt scheme to register for push notifications when the relevant query information changes. 
The relevant queries are: Query ID Request Query 0x52 [Register updates for setting(s)](/OpenGoPro/ble/features/query.htmlregister-for-setting-value-updates) len:52:xx:xx 0x53 [Register updates for status(es)](/OpenGoPro/ble/features/query.htmlregister-for-status-value-updates) len:53:xx:xx 0x72 [Unregister updates for setting(s)](/OpenGoPro/ble/features/query.htmlunregister-for-setting-value-updates) len:72:xx:xx 0x73 [Unregister updates for status(es)](/OpenGoPro/ble/features/query.htmlunregister-for-status-value-updates) len:73:xx:xx where xx are setting / status ID(s) and len is the length of the rest of the query (the number of query bytes plus one for the request ID byte). The Query ID’s for push notification responses are as follows: Query ID Response 0x92 Setting Value Push Notification 0x93 Status Value Push Notification Here is a generic sequence diagram of how this looks (the same is true for statuses): GoProOpen GoPro user deviceGoProOpen GoPro user deviceConnected (steps from connect tutorial)loop[Setting changes]loop[Settingchanges]Register updates for settingNotification Response and Current Setting ValueSetting changesPush notification of new setting valueUnregister updates for settingNotification ResponseSetting changes That is, after registering for push notifications for a given query, notification responses will continuously be sent whenever the query changes until the client unregisters for push notifications for the given query. The initial response to the Register query also contains the current setting / status value. We will walk through an example of this below: First, let’s register for updates when the resolution setting changes: python kotlin query_request_uuid = GoProUuid.QUERY_REQ_UUID request = bytes([0x02, 0x52, RESOLUTION_ID]) await client.write_gatt_char(query_request_uuid.value, request, response=True) Wait to receive the notification response response = await received_responses.get() val registerResolutionUpdates = ubyteArrayOf(0x02U, 0x52U, RESOLUTION_ID) ble.writeCharacteristic(goproAddress, GoProUUID.CQ_QUERY.uuid, registerResolutionUpdates) and parse its response (which includes the current resolution value). This is very similar to the polling example with the exception that the Query ID is now 0x52 (Register Updates for Settings). This can be seen in the raw byte data as well as by inspecting the response’s id property. python kotlin response.parse() resolution = Resolution(response.data[RESOLUTION_ID][0]) logger.info(f\"Resolution is currently {resolution}\") This will show in the log as such: Registering for resolution updates Writing to GoProUuid.QUERY_REQ_UUID: 02:52:02 Received response at GoProUuid.QUERY_RSP_UUID: 05:52:00:02:01:09 Received the Resolution Query response Successfully registered for resolution value updates Resolution is currently Resolution.RES_1080 val queryResponse = (receivedResponses.receive() as Response.Query).apply { parse() } resolution = Resolution.fromValue(queryResponse.data.getValue(RESOLUTION_ID).first()) This will show in the log as such: Registering for resolution value updates Writing characteristic b5f90076-aa8d-11e3-9046-0002a5d5c51b ==> 02:52:02 Wrote characteristic b5f90076-aa8d-11e3-9046-0002a5d5c51b Characteristic b5f90077-aa8d-11e3-9046-0002a5d5c51b changed | value: 05:52:00:02:01:04 Received response on CQ_QUERY_RSP Received packet of length 5. 
0 bytes remaining Received Query Response Camera resolution is RES_2_7K We are now successfully registered for resolution value updates and will receive push notifications whenever the resolution changes. We verify this in the demo by changing the resolution and waiting to receive the update notification. python kotlin target_resolution = Resolution.RES_2_7K if resolution is Resolution.RES_1080 else Resolution.RES_1080 request = bytes([0x03, 0x02, 0x01, target_resolution.value]) await client.write_gatt_char(setting_request_uuid.value, request, response=True) response = await received_responses.get() response.parse() while resolution is not target_resolution: request = bytes([0x02, 0x12, RESOLUTION_ID]) await client.write_gatt_char(query_request_uuid.value, request, response=True) response = await received_responses.get() Wait to receive the notification response response.parse() resolution = Resolution(response.data[RESOLUTION_ID][0]) This will show in the log as such: Changing the resolution to Resolution.RES_2_7K... Writing to GoProUuid.SETTINGS_REQ_UUID: 03:02:01:04 Received response at GoProUuid.SETTINGS_RSP_UUID: 02:02:00 Received Set Setting command response. Waiting to receive new resolution Received response at GoProUuid.QUERY_RSP_UUID: 05:92:00:02:01:04 Received the Resolution Query response Resolution is currently Resolution.RES_2_7K Resolution has changed as expected. Exiting... val targetResolution = if (resolution == Resolution.RES_2_7K) Resolution.RES_1080 else Resolution.RES_2_7K val setResolution = ubyteArrayOf(0x03U, RESOLUTION_ID, 0x01U, targetResolution.value) ble.writeCharacteristic(goproAddress, GoProUUID.CQ_SETTING.uuid, setResolution) val setResolutionResponse = (receivedResponses.receive() as Response.Tlv).apply { parse() } // Verify we receive the update from the camera when the resolution changes while (resolution != targetResolution) { val queryNotification = (receivedResponses.receive() as Response.Query).apply { parse() } resolution = Resolution.fromValue(queryNotification.data.getValue(RESOLUTION_ID).first()) } We can see the change happen in the log: Changing the resolution to RES_2_7K Writing characteristic b5f90074-aa8d-11e3-9046-0002a5d5c51b ==> 03:02:01:04 Wrote characteristic b5f90074-aa8d-11e3-9046-0002a5d5c51b Resolution successfully changed Waiting for camera to inform us about the resolution change Characteristic b5f90077-aa8d-11e3-9046-0002a5d5c51b changed | value: 05:92:00:02:01:04 Received response on b5f90077-aa8d-11e3-9046-0002a5d5c51b: 05:92:00:02:01:04 Received resolution query response Resolution is now RES_2_7K In this case, the Query ID is 0x92 (Setting Value Push Notification) as expected. Multiple push notifications can be registered / received in a similar manner to how multiple queries were polled above. Quiz time! 📚 ✏️ True or False: We can still poll a given query value while we are currently registered to receive push notifications for it. A: True B: False Submit Answer Correct!! 😃 Incorrect!! 😭 The correct answer is A. While there is probably not a good reason to do so, there is nothing preventing polling in this manner. True or False: A push notification for a registered setting will only ever contain query information about one setting ID. A: True B: False Submit Answer Correct!! 😃 Incorrect!! 😭 The correct answer is B. It is possible for push notifications to contain multiple setting ID’s if both setting ID’s have push notifications registered and both settings change at the same time. 
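As a compact recap of the register-and-push flow at the byte level, here is a hedged Python sketch; it assumes the bleak client, the QueryResponse accumulation, and the received_responses queue from the sections above, and the constant names are illustrative:

```python
# Sketch of registering for resolution updates and consuming pushes, assuming
# the bleak client, QueryResponse parsing, and received_responses queue shown
# earlier in this tutorial. Constant names are illustrative.
RESOLUTION_ID = 0x02
REGISTER_SETTING_UPDATES = 0x52   # Query ID: register for setting value updates
SETTING_PUSH_NOTIFICATION = 0x92  # Query ID on asynchronous push notifications


async def watch_resolution(client, received_responses, query_req_uuid) -> None:
    # Register: [length, query ID, setting ID] written to the Query Request UUID
    request = bytes([0x02, REGISTER_SETTING_UPDATES, RESOLUTION_ID])
    await client.write_gatt_char(query_req_uuid, request, response=True)

    # The first response (query ID 0x52) carries the current value; afterwards,
    # unsolicited responses with query ID 0x92 arrive whenever the value changes.
    while True:
        response = await received_responses.get()
        response.parse()
        value = response.data[RESOLUTION_ID][0]
        if response.id == SETTING_PUSH_NOTIFICATION:
            print(f"Resolution changed to {value:#04x}")
        else:
            print(f"Current resolution is {value:#04x}")
```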
Troubleshooting See the first tutorial’s troubleshooting section. Good Job! Congratulations 🤙 You can now query any of the settings / statuses from the camera using one of the above patterns.",
+ "categories": [],
+ "tags": [],
+ "url": "/OpenGoPro/tutorials/ble-queries#"
+ },
+ {
+ "title": "Tutorial 5: BLE Protobuf Operations: ",
+ "excerpt": "This document will provide a walk-through tutorial to use the Open GoPro Interface to send and receive BLE Protobuf Data. Open GoPro uses Protocol Buffers Version 2 A list of Protobuf Operations can be found in the Protobuf ID Table. This tutorial only considers sending these as one-off operations. That is, it does not consider state management / synchronization when sending multiple operations. This will be discussed in a future lab. Requirements It is assumed that the hardware and software requirements from the connecting BLE tutorial are present and configured correctly. It is suggested that you have first completed the connect, sending commands, and parsing responses tutorials before going through this tutorial. Just Show me the Demo(s)!! python kotlin Each of the scripts for this tutorial can be found in the Tutorial 5 directory. Python >= 3.9 and < 3.12 must be used as specified in the requirements Protobuf Example You can see some basic Protobuf usage, independent of a BLE connection, in the following script: $ python protobuf_example.py Set Turbo Mode You can test sending Set Turbo Mode to your camera through BLE using the following script: $ python set_turbo_mode.py See the help for parameter definitions: $ python set_turbo_mode.py --help usage: set_turbo_mode.py [-h] [-i IDENTIFIER] Connect to a GoPro camera, send Set Turbo Mode and parse the response options: -h, --help show this help message and exit -i IDENTIFIER, --identifier IDENTIFIER Last 4 digits of GoPro serial number, which is the last 4 digits of the default camera SSID. If not used, first discovered GoPro will be connected to Decipher Response Type TODO TODO Compiling Protobuf Files The Protobuf files used to compile source code for the Open GoPro Interface exist in the top-level protobuf directory of the Open GoPro repository. It is mostly out of the scope of these tutorials to describe how to compile these since this process is clearly defined in the per-language Protobuf Tutorial. For the purposes of these tutorials (and shared with the Python SDK), the Protobuf files are compiled using the Docker image defined in .admin/proto_build. This build process can be performed using make protos from the top level of this repo. This information is strictly explanatory. It is in no way necessary to (re)build the Protobuf files for these tutorials as the pre-compiled Protobuf source code already resides in the same directory as this tutorial’s example code. Working with Protobuf Messages Let’s first perform some basic serialization and deserialization of a Protobuf message. For this example, we are going to use the Set Turbo Transfer operation: Set Turbo Mode Documentation Per the documentation, this operation’s request payload should be serialized using the Protobuf message which can be found either in Documentation: RequestSetTurboActive documentation or source code: /** * Enable/disable display of \"Transferring Media\" UI * * Response: @ref ResponseGeneric */ message RequestSetTurboActive { required bool active = 1; // Enable or disable Turbo Transfer feature } This code can be found in protobuf_example.py Protobuf Message Example First let’s instantiate the request message by setting the active parameter and log the serialized bytes: Your IDE should show the Protobuf Message’s API signature since type stubs were generated when compiling the Protobuf files. 
python kotlin from tutorial_modules import proto request = proto.RequestSetTurboActive(active=False) logger.info(f\"Sending ==> {request}\") logger.info(request.SerializeToString().hex(\":\")) which will log as such: Sending ==> active: false 08:00 TODO We’re not going to analyze these bytes since it is the purpose of the Protobuf framework to abstract this. However, it is important to be able to generate the serialized bytes from the instantiated Protobuf Message object in order to send the bytes via BLE. Similarly, let’s now create a serialized response and show how to deserialize it into a ResponseGeneric object. python kotlin response_bytes = proto.ResponseGeneric(result=proto.EnumResultGeneric.RESULT_SUCCESS).SerializeToString() logger.info(f\"Received bytes ==> {response_bytes.hex(':')}\") response = proto.ResponseGeneric.FromString(response_bytes) logger.info(f\"Received ==> {response}\") which will log as such: Received bytes ==> 08:01 Received ==> result: RESULT_SUCCESS TODO We’re not hard-coding serialized bytes here since they may not be constant across Protobuf versions Performing a Protobuf Operation Now let’s actually perform a Protobuf Operation via BLE. First we need to discuss additional non-Protobuf-defined header bytes that are required for Protobuf Operations in the Open GoPro Interface. Protobuf Packet Format Besides having a compressed payload as defined per the Protobuf Specification, Open GoPro Protobuf operations also are identified by “Feature” and “Action” IDs. The top level message format (not including the standard headers) is as follows: Feature ID Action ID Serialized Protobuf Payload 1 Byte 1 Byte Variable Length This Feature / Action ID pair is used to identify the Protobuf Message that should be used to serialize / deserialize the payload. This mapping can be found in the Protobuf ID Table. Protobuf Response Parser Since the parsing of Protobuf messages is different than TLV Parsing, we need to create a ProtobufResponse class by extending the Response class from the TLV Parsing Tutorial. This ProtobufResponse parse method will: Extract Feature and Action ID’s Parse the Protobuf payload using the specified Protobuf Message python kotlin This code can be found in set_turbo_mode.py class ProtobufResponse(Response): ... def parse(self, proto: type[ProtobufMessage]) -> None: self.feature_id = self.raw_bytes[0] self.action_id = self.raw_bytes[1] self.data = proto.FromString(bytes(self.raw_bytes[2:])) TODO The accumulation process is the same for TLV and Protobuf responses so we have not overridden the base Response class’s accumulation method and we are using the same notification handler as previous labs. Set Turbo Transfer Now let’s perform the Set Turbo Transfer operation and receive the response. First, we build the serialized byte request in the same manner as above, then prepend the Feature ID, Action ID, and length bytes: python kotlin turbo_mode_request = bytearray( [ 0xF1, Feature ID 0x6B, Action ID *proto.RequestSetTurboActive(active=False).SerializeToString(), ] ) turbo_mode_request.insert(0, len(turbo_mode_request)) TODO We then send the message, wait to receive the response, and parse the response using the Protobuf Message specified from the Set Turbo Mode Documentation: ResponseGeneric. 
python kotlin await client.write_gatt_char(request_uuid.value, turbo_mode_request, response=True) response = await received_responses.get() response.parse(proto.ResponseGeneric) assert response.feature_id == 0xF1 assert response.action_id == 0xEB logger.info(response.data) which will log as such: Setting Turbo Mode Off Writing 04:f1:6b:08:00 to GoProUuid.COMMAND_REQ_UUID Received response at UUID GoProUuid.COMMAND_RSP_UUID: 04:f1:eb:08:01 Set Turbo Mode response complete received. Successfully set turbo mode result: RESULT_SUCCESS TODO Deciphering Response Type This same procedure is used for all Protobuf Operations. Coupled with the information from previous tutorials, you are now capable of parsing any response received from the GoPro. However we have not yet covered how to decipher the response type: Command, Query, Protobuf, etc. The algorithm to do so is defined in the GoPro BLE Spec and reproduced here for reference: Message Deciphering Algorithm Response Manager We’re now going to create a monolithic ResponseManager class to implement this algorithm to perform (at least initial) parsing of all response types: python kotlin The sample code below is taken from decipher_response.py The ResponseManager is a wrapper around a BleakClient to manage accumulating, parsing, and retrieving responses. First, let’s create a non-initialized response manager, connect to get a BleakClient and initialize the manager by setting the client: manager = ResponseManager() manager.set_client(await connect_ble(manager.notification_handler, identifier)) Then, in the notification handler, we “decipher” the response before enqueueing it to the received response queue: async def notification_handler(self, characteristic: BleakGATTCharacteristic, data: bytearray) -> None: uuid = GoProUuid(self.client.services.characteristics[characteristic.handle].uuid) logger.debug(f'Received response at {uuid}: {data.hex(\":\")}') response = self._responses_by_uuid[uuid] response.accumulate(data) Enqueue if we have received the entire response if response.is_received: await self._q.put(self.decipher_response(response)) Reset the accumulating response self._responses_by_uuid[uuid] = Response(uuid) where “deciphering” is the implementation of the above algorithm: def decipher_response(self, undeciphered_response: Response) -> ConcreteResponse: payload = undeciphered_response.raw_bytes Are the first two payload bytes a real Fetaure / Action ID pair? if (index := ProtobufId(payload[0], payload[1])) in ProtobufIdToMessage: if not (proto_message := ProtobufIdToMessage.get(index)): We've only added protobuf messages for operations used in this tutorial. raise RuntimeError( f\"{index} is a valid Protobuf identifier but does not currently have a defined message.\" ) else: Now use the protobuf messaged identified by the Feature / Action ID pair to parse the remaining payload response = ProtobufResponse.from_received_response(undeciphered_response) response.parse(proto_message) return response TLV. Should it be parsed as Command or Query? if undeciphered_response.uuid is GoProUuid.QUERY_RSP_UUID: It's a TLV query response = QueryResponse.from_received_response(undeciphered_response) else: It's a TLV command / setting. response = TlvResponse.from_received_response(undeciphered_response) Parse the TLV payload (query, command, or setting) response.parse() return response Only the minimal functionality needed for these tutorials have been added. 
For example, many Protobuf Feature / Action ID pairs do not have corresponding Protobuf Messages defined. TODO After deciphering, the parsed method is placed in the response queue as a either a TlvResponse, QueryResponse, or ProtobufResponse. Examples of Each Response Type Now let’s perform operations that will demonstrate each response type: python kotlin TLV Command (Setting) await set_resolution(manager) TLV Command await get_resolution(manager) TLV Query await set_shutter_off(manager) Protobuf await set_turbo_mode(manager) These four methods will perform the same functionality we’ve demonstrated in previous tutorials, now using our ResponseManager. We’ll walk through the get_resolution method here. First build the request and send it: request = bytes([0x03, 0x02, 0x01, 0x09]) request_uuid = GoProUuid.SETTINGS_REQ_UUID await manager.client.write_gatt_char(request_uuid.value, request, response=True) Then retrieve the response from the manager: tlv_response = await manager.get_next_response_as_tlv() logger.info(f\"Set resolution status: {tlv_response.status}\") This logs as such: Getting the current resolution Writing to GoProUuid.QUERY_REQ_UUID: 02:12:02 Received response at GoProUuid.QUERY_RSP_UUID: 05:12:00:02:01:09 Received current resolution: Resolution.RES_1080 Note that each example retrieves the parsed response from the manager via one of the following methods: get_next_response_as_tlv get_next_response_as_query get_next_response_as_response These are functionally the same as they just retrieve the next received response from the manager’s queue and only exist as helpers to simplify typing. TODO Troubleshooting See the first tutorial’s troubleshooting section. Good Job! Congratulations 🤙 You can now accumulate, decipher, and parse any BLE response received from the GoPro.",
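For reference, here is a minimal sketch of what those typed helper methods might look like. It assumes the ResponseManager stores already-deciphered responses in an asyncio.Queue and that TlvResponse, QueryResponse, and ProtobufResponse are the classes built earlier in this tutorial; the method bodies are illustrative, not the tutorial's exact implementation:

```python
# Illustrative sketch only: typed retrieval helpers over the manager's queue.
# TlvResponse, QueryResponse, and ProtobufResponse are assumed to be the
# response classes defined earlier in this tutorial.
import asyncio


class ResponseManagerSketch:
    def __init__(self) -> None:
        self._q: asyncio.Queue = asyncio.Queue()  # filled by the notification handler

    async def get_next_response_as_tlv(self) -> "TlvResponse":
        response = await self._q.get()
        assert isinstance(response, TlvResponse)  # narrow the type for the caller
        return response

    async def get_next_response_as_query(self) -> "QueryResponse":
        response = await self._q.get()
        assert isinstance(response, QueryResponse)
        return response

    async def get_next_response_as_protobuf(self) -> "ProtobufResponse":
        response = await self._q.get()
        assert isinstance(response, ProtobufResponse)
        return response
```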
+ "categories": [],
+ "tags": [],
+ "url": "/OpenGoPro/tutorials/ble-protobuf#"
+ },
+ {
+ "title": "Tutorial 6: Connect WiFi: ",
+ "excerpt": "This document will provide a walk-through tutorial to use the Open GoPro Interface to connect the GoPro to a Wifi network either in Access Point (AP) mode or Station (STA) Mode. It is recommended that you have first completed the connecting BLE, sending commands, parsing responses, and protobuf tutorials before proceeding. Requirements It is assumed that the hardware and software requirements from the connecting BLE tutorial are present and configured correctly. The scripts that will be used for this tutorial can be found in the Tutorial 6 Folder. Just Show me the Demo(s)!! python kotlin Each of the scripts for this tutorial can be found in the Tutorial 6 directory. Python >= 3.9 and < 3.12 must be used as specified in the requirements Enable WiFi AP You can enable the GoPro’s Access Point to allow it accept Wifi connections as an Access Point via: $ python wifi_enable.py See the help for parameter definitions: $ python wifi_enable.py --help usage: enable_wifi_ap.py [-h] [-i IDENTIFIER] [-t TIMEOUT] Connect to a GoPro camera via BLE, get its WiFi Access Point (AP) info, and enable its AP. options: -h, --help show this help message and exit -i IDENTIFIER, --identifier IDENTIFIER Last 4 digits of GoPro serial number, which is the last 4 digits of the default camera SSID. If not used, first discovered GoPro will be connected to -t TIMEOUT, --timeout TIMEOUT time in seconds to maintain connection before disconnecting. If not set, will maintain connection indefinitely Connect GoPro as STA You can connect the GoPro to a Wifi network where the GoPro is in Station Mode (STA) via: $ python connect_as_sta.py See the help for parameter definitions: $ python connect_as_sta.py --help Connect the GoPro to a Wifi network where the GoPro is in Station Mode (STA). positional arguments: ssid SSID of network to connect to password Password of network to connect to options: -h, --help show this help message and exit -i IDENTIFIER, --identifier IDENTIFIER Last 4 digits of GoPro serial number, which is the last 4 digits of the default camera SSID. If not used, first discovered GoPro will be connected to The Kotlin file for this tutorial can be found on Github. To perform the tutorial, run the Android Studio project, select “Tutorial 6” from the dropdown and click on “Perform.” This requires that a GoPro is already connected via BLE, i.e. that Tutorial 1 was already run. You can check the BLE status at the top of the app. Perform Tutorial 6 This will start the tutorial and log to the screen as it executes. When the tutorial is complete, click “Exit Tutorial” to return to the Tutorial selection screen. Setup For both cases, we must first connect to BLE as was discussed in the connecting BLE tutorial. Access Point Mode (AP) In AP mode, the GoPro operates as an Access Point, allowing wireless clients to connect and communicate using the Open GoPro HTTP API. The HTTP API provides much of the same functionality as the BLE API as well as some additional functionality. For more information on the HTTP API, see the next 2 tutorials. AccessPointGoProclientBLEWiFi In order to connect to the camera in AP mode, after connecting via BLE, pairing, and enabling notifications, we must: find the GoPro’s WiFi AP information (SSID and password) via BLE, enable the WiFi AP via BLE connect to the WiFi AP. 
Here is an outline of the steps to do so: GoProWiFiGoProBLEOpen GoPro user deviceGoProWiFiGoProBLEOpen GoPro user deviceScanningConnectedalt[If not Previously Paired]PairedReady to Communicateloop[Steps from Connect Tutorial]WiFi AP enabledAdvertisingAdvertisingConnectPair RequestPair ResponseEnable Notifications on Characteristic 1Enable Notifications on Characteristic 2Enable Notifications on Characteristic ..Enable Notifications on Characteristic NRead Wifi AP SSIDRead Wifi AP PasswordWrite to Enable WiFi APResponse sent as notificationConnect to WiFi AP The following subsections will detail this process. Find WiFi Information First we must find the target Wifi network’s SSID and password. The process to get this information is different than all other BLE operations described up to this point. Whereas the previous command, setting, and query operations all followed the Write Request-Notification Response pattern, the WiFi Information is retrieved via direct Read Requests to BLE characteristics. Get WiFi SSID The WiFi SSID can be found by reading from the WiFi AP SSID characteristic of the WiFi Access Point service. Let’s send the read request to get the SSID and decode it into a string. python kotlin ssid_uuid = GoProUuid.WIFI_AP_SSID_UUID logger.info(f\"Reading the WiFi AP SSID at {ssid_uuid}\") ssid = (await client.read_gatt_char(ssid_uuid.value)).decode() logger.info(f\"SSID is {ssid}\") There is no need for a synchronization event as the information is available when the read_gatt_char method returns. In the demo, this information is logged as such: Reading the WiFi AP SSID at GoProUuid.WIFI_AP_SSID_UUID SSID is GP24500702 ble.readCharacteristic(goproAddress, GoProUUID.WIFI_AP_SSID.uuid).onSuccess { ssid = it.decodeToString() } Timber.i(\"SSID is $ssid\") In the demo, this information is logged as such: Getting the SSID Read characteristic b5f90002-aa8d-11e3-9046-0002a5d5c51b : value: 64:65:62:75:67:68:65:72:6F:31:31 SSID is debughero11 Get WiFi Password The WiFi password can be found by reading from the WiFi AP password characteristic of the WiFi Access Point service. Let’s send the read request to get the password and decode it into a string. python kotlin password_uuid = GoProUuid.WIFI_AP_PASSWORD_UUID logger.info(f\"Reading the WiFi AP password at {password_uuid}\") password = (await client.read_gatt_char(password_uuid.value)).decode() logger.info(f\"Password is {password}\") There is no need for a synchronization event as the information is available when the read_gatt_char method returns. In the demo, this information is logged as such: Reading the WiFi AP password at GoProUuid.WIFI_AP_PASSWORD_UUID Password is p@d-NNc-2ts ble.readCharacteristic(goproAddress, GoProUUID.WIFI_AP_PASSWORD.uuid).onSuccess { password = it.decodeToString() } Timber.i(\"Password is $password\") In the demo, this information is logged as such: Getting the password Read characteristic b5f90003-aa8d-11e3-9046-0002a5d5c51b : value: 7A:33:79:2D:44:43:58:2D:50:68:6A Password is z3y-DCX-Phj Enable WiFi AP Before we can connect to the WiFi AP, we have to make sure the access point is enabled. This is accomplished via the AP Control command: Command Bytes Ap Control Enable 0x03 0x17 0x01 0x01 Ap Control Disable 0x03 0x17 0x01 0x00 We are using the same notification handler that was defined in the sending commands tutorial. Let’s write the bytes to the “Command Request UUID” to enable the WiFi AP! 
python kotlin event.clear() request = bytes([0x03, 0x17, 0x01, 0x01]) command_request_uuid = GoProUuid.COMMAND_REQ_UUID await client.write_gatt_char(command_request_uuid.value, request, response=True) await event.wait() Wait to receive the notification response We make sure to clear the synchronization event before writing, then pend on the event until it is set in the notification callback. val enableWifiCommand = ubyteArrayOf(0x03U, 0x17U, 0x01U, 0x01U) ble.writeCharacteristic(goproAddress, GoProUUID.CQ_COMMAND.uuid, enableWifiCommand) receivedData.receive() Note that we have received the “Command Status” notification response from the Command Response characteristic since we enabled it’s notifications in Enable Notifications. This can be seen in the demo log: python kotlin Enabling the WiFi AP Writing to GoProUuid.COMMAND_REQ_UUID: 03:17:01:01 Received response at GoProUuid.COMMAND_RSP_UUID: 02:17:00 Command sent successfully WiFi AP is enabled Enabling the camera's Wifi AP Writing characteristic b5f90072-aa8d-11e3-9046-0002a5d5c51b ==> 03:17:01:01 Wrote characteristic b5f90072-aa8d-11e3-9046-0002a5d5c51b Characteristic b5f90073-aa8d-11e3-9046-0002a5d5c51b changed | value: 02:17:00 Received response on b5f90073-aa8d-11e3-9046-0002a5d5c51b: 02:17:00 Command sent successfully As expected, the response was received on the correct UUID and the status was “success”. Establish Connection to WiFi AP python kotlin If you have been following through the ble_enable_wifi.py script, you will notice that it ends here such that we know the WiFi SSID / password and the WiFi AP is enabled. This is because there are many different methods of connecting to the WiFi AP depending on your OS and the framework you are using to develop. You could, for example, simply use your OS’s WiFi GUI to connect. While out of the scope of these tutorials, there is a programmatic example of this in the cross-platform WiFi Demo from the Open GoPro Python SDK. Using the passwsord and SSID we discovered above, we will now connect to the camera’s network: wifi.connect(ssid, password) This should show a system popup on your Android device that eventually goes away once the Wifi is connected. This connection process appears to vary drastically in time. Quiz time! 📚 ✏️ How is the WiFi password response received? A: As a read response from the WiFi AP Password characteristic B: As write responses to the WiFi Request characteristic C: As notifications of the Command Response characteristic Submit Answer Correct!! 😃 Incorrect!! 😭 The correct answer is A. This (and WiFi AP SSID) is an exception to the rule. Usually responses are received as notifications to a response characteristic. However, in this case, it is received as a direct read response (since we are reading from the characteristic and not writing to it). Which of the following statements about the GoPro WiFi AP is true? A: It only needs to be enabled once and it will then always remain on B: The WiFi password will never change C: The WiFi SSID will never change D: None of the Above Submit Answer Correct!! 😃 Incorrect!! 😭 The correct answer is D. While the WiFi AP will remain on for some time, it can and will eventually turn off so it is always recommended to first connect via BLE and ensure that it is enabled. The password and SSID will almost never change. However, they will change if the connections are reset via Connections->Reset Connections. You are now connected to the GoPro’s Wifi AP and can send any of the HTTP commands defined in the HTTP Specification. 
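Putting the pieces above together, a minimal sketch of the whole AP-mode setup might look as follows. It assumes a connected bleak BleakClient, the GoProUuid enum from the tutorial package, and the asyncio.Event set by the notification handler from the sending-commands tutorial:

```python
# Sketch: read the AP credentials and enable the AP over an existing BLE
# connection. `client` is a connected bleak.BleakClient; `event` is the
# asyncio.Event set by the notification handler; GoProUuid is the UUID enum
# used throughout these tutorials (assumed to be importable here).
import asyncio
from bleak import BleakClient


async def read_ap_info_and_enable(client: BleakClient, event: asyncio.Event) -> tuple[str, str]:
    # SSID and password are plain characteristic reads (no notification involved)
    ssid = (await client.read_gatt_char(GoProUuid.WIFI_AP_SSID_UUID.value)).decode()
    password = (await client.read_gatt_char(GoProUuid.WIFI_AP_PASSWORD_UUID.value)).decode()

    # AP Control Enable command (length-prefixed 0x17 0x01 0x01)
    event.clear()
    await client.write_gatt_char(
        GoProUuid.COMMAND_REQ_UUID.value, bytes([0x03, 0x17, 0x01, 0x01]), response=True
    )
    await event.wait()  # pend until the Command Status notification arrives

    return ssid, password
```

From here, joining the returned SSID is OS-specific, as described above.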
Station (STA) Mode Station Mode is where the GoPro operates as a Station, allowing the camera to connect to and communicate with an Access Point such as a switch or a router. This is used, for example, in the livestreaming and camera on the home network (COHN) features. StationAccessPointGoProrouterclientBLEWifi When the GoPro is in Station Mode, there is no HTTP communication channel to the Open GoPro client. The GoPro can still be controlled via BLE. In order to configure the GoPro in Station mode, after connecting via BLE, pairing, and enabling notifications, we must: scan for available networks connect to a discovered network, using the correct API based on whether or not we have previously connected to this network The following subsections will detail these steps. All of the Protobuf operations are performed in the same manner as in the protobuf tutorial such as reusing the ResponseManager. Scan for Networks It is always necessary to scan for networks, regardless of whether you already have a network’s information and know it is available. Failure to do so follows an untested and unsupported path in the GoPro’s connection state machine. The process of scanning for networks requires several Protobuf Operations as summarized here: Scan For Networks First we must request the GoPro to Scan For Access Points: python kotlin The code here is taken from connect_as_sta.py Let’s send the scan request and then retrieve and parse notifications until we receive a notification where the scanning_state is set to SCANNING_SUCCESS. Then we store the scan id from the notification for later use in retrieving the scan results. start_scan_request = bytearray( [ 0x02, Feature ID 0x02, Action ID *proto.RequestStartScan().SerializePartialToString(), ] ) start_scan_request.insert(0, len(start_scan_request)) await manager.client.write_gatt_char(GoProUuid.NETWORK_MANAGEMENT_REQ_UUID.value, start_scan_request, response=True) while response := await manager.get_next_response_as_protobuf(): ... elif response.action_id == 0x0B: Scan Notifications scan_notification: proto.NotifStartScanning = response.data type: ignore logger.info(f\"Received scan notification: {scan_notification}\") if scan_notification.scanning_state == proto.EnumScanning.SCANNING_SUCCESS: return scan_notification.scan_id This will log as such: Scanning for available Wifi Networks Writing: 02:02:02 Received response at GoProUuid.NETWORK_MANAGEMENT_RSP_UUID: 06:02:82:08:01:10:02 Received response at GoProUuid.NETWORK_MANAGEMENT_RSP_UUID: 0a:02:0b:08:05:10:01:18:05:20:01 Received scan notification: scanning_state: SCANNING_SUCCESS scan_id: 1 total_entries: 5 total_configured_ssid: 1 TODO Next we must request the GoPro to return the Scan Results. Using the scan_id from above, let’s send the Get AP Scan Results request, then retrieve and parse the response: python kotlin results_request = bytearray( [ 0x02, Feature ID 0x03, Action ID *proto.RequestGetApEntries(start_index=0, max_entries=100, scan_id=scan_id).SerializePartialToString(), ] ) results_request.insert(0, len(results_request)) await manager.client.write_gatt_char(GoProUuid.NETWORK_MANAGEMENT_REQ_UUID.value, results_request, response=True) response := await manager.get_next_response_as_protobuf(): entries_response: proto.ResponseGetApEntries = response.data type: ignore logger.info(\"Found the following networks:\") for entry in entries_response.entries: logger.info(str(entry)) return list(entries_response.entries) This will log as such: Getting the scanned networks. 
Writing: 08:02:03:08:00:10:64:18:01 Received response at GoProUuid.NETWORK_MANAGEMENT_RSP_UUID: 20:76:02:83:08:01:10:01:1a:13:0a:0a:64:61:62:75:67:64:61:62 Received response at GoProUuid.NETWORK_MANAGEMENT_RSP_UUID: 80:75:67:10:03:20:e4:28:28:2f:1a:13:0a:0a:41:54:54:54:70:34 Received response at GoProUuid.NETWORK_MANAGEMENT_RSP_UUID: 81:72:36:46:69:10:02:20:f1:2c:28:01:1a:13:0a:0a:41:54:54:62 Received response at GoProUuid.NETWORK_MANAGEMENT_RSP_UUID: 82:37:4a:67:41:77:61:10:02:20:99:2d:28:01:1a:16:0a:0d:52:69 Received response at GoProUuid.NETWORK_MANAGEMENT_RSP_UUID: 83:6e:67:20:53:65:74:75:70:20:65:37:10:01:20:ec:12:28:00:1a Received response at GoProUuid.NETWORK_MANAGEMENT_RSP_UUID: 84:17:0a:0e:48:6f:6d:65:79:6e:65:74:5f:32:47:45:58:54:10:01 Received response at GoProUuid.NETWORK_MANAGEMENT_RSP_UUID: 85:20:85:13:28:01 Found the following networks: ssid: \"dabugdabug\" signal_strength_bars: 3 signal_frequency_mhz: 5220 scan_entry_flags: 47 ssid: \"ATTTp4r6Fi\" signal_strength_bars: 2 signal_frequency_mhz: 5745 scan_entry_flags: 1 ssid: \"ATTb7JgAwa\" signal_strength_bars: 2 signal_frequency_mhz: 5785 scan_entry_flags: 1 ssid: \"Ring Setup e7\" signal_strength_bars: 1 signal_frequency_mhz: 2412 scan_entry_flags: 0 ssid: \"Homeynet_2GEXT\" signal_strength_bars: 1 signal_frequency_mhz: 2437 scan_entry_flags: 1 TODO At this point we have all of the discovered networks. Continue on to see how to use this information. Connect to Network Depending on whether the GoPro has already connected to the desired network, we must next perform either the Connect or Connect New operation. This will be described below but first, a note on fragmentation: GATT Write Fragmentation Up to this point in the tutorials, all of the operations we have been performing have resulted in GATT write requests guaranteed to be less than maximum BLE packet size of 20 bytes. However, depending on the SSID and password used in the Connect New operation, this maximum size might be surpassed. Therefore, it is necessary to fragment the payload. This is essentially the inverse of the accumulation algorithm. We accomplish this as follows: python kotlin Let’s create a generator to yield fragmented packets (yield_fragmented_packets) from a monolithic payload. First, depending on the length of the payload, we create the header for the first packet that specifies the total payload length: if length < (2**5 - 1): header = bytearray([length]) elif length < (2**13 - 1): header = bytearray((length | 0x2000).to_bytes(2, \"big\", signed=False)) elif length < (2**16 - 1): header = bytearray((length | 0x6400).to_bytes(2, \"big\", signed=False)) Then we chunk through the payload, prepending either the above header for the first packet or the continuation header for subsequent packets: byte_index = 0 while bytes_remaining := length - byte_index: If this is the first packet, use the appropriate header. 
Else use the continuation header if is_first_packet: packet = bytearray(header) is_first_packet = False else: packet = bytearray(CONTINUATION_HEADER) Build the current packet packet_size = min(MAX_PACKET_SIZE - len(packet), bytes_remaining) packet.extend(bytearray(payload[byte_index : byte_index + packet_size])) yield bytes(packet) Increment byte_index for continued processing byte_index += packet_size Finally, we create a helper method that we can reuse throughout the tutorials to use this generator to send GATT Writes using a given Bleak client: async def fragment_and_write_gatt_char(client: BleakClient, char_specifier: str, data: bytes): for packet in yield_fragmented_packets(data): await client.write_gatt_char(char_specifier, packet, response=True) TODO The safest solution would be to always use the above fragmentation method. For the sake of simplicity in these tutorials, we are only using this where there is a possibility of exceeding the maximum BLE packet size. Connect Example In order to proceed, we must first inspect the scan result gathered from the previous section to see which connect operation to use. Specifically, we are checking the scan_entry_flags to see if the SCAN_FLAG_CONFIGURED bit is set. If the bit is set (and thus we have already provisioned this network) then we must use Connect. Otherwise we must use Connect New: python kotlin if entry.scan_entry_flags & proto.EnumScanEntryFlags.SCAN_FLAG_CONFIGURED: connect_request = bytearray( [ 0x02, Feature ID 0x04, Action ID *proto.RequestConnect(ssid=entry.ssid).SerializePartialToString(), ] ) else: connect_request = bytearray( [ 0x02, Feature ID 0x05, Action ID *proto.RequestConnectNew(ssid=entry.ssid, password=password).SerializePartialToString(), ] ) TODO Now that we have the correct request built, we can send it using our newly created fragmentation method. Then we will continuously receive Provisioning Notifications which should be checked until the provisioning_state is set to PROVISIONING_SUCCESS_NEW_AP. The final provisioning_state that we are looking for is always PROVISIONING_SUCCESS_NEW_AP both in the Connect and Connect New use cases. The procedure is summarized here: Connect to Already Configured Network python kotlin await fragment_and_write_gatt_char(manager.client, GoProUuid.NETWORK_MANAGEMENT_REQ_UUID.value, connect_request) while response := await manager.get_next_response_as_protobuf(): ... elif response.action_id == 0x0C: NotifProvisioningState Notifications provisioning_notification: proto.NotifProvisioningState = response.data type: ignore if provisioning_notification.provisioning_state == proto.EnumProvisioning.PROVISIONING_SUCCESS_NEW_AP: return TODO At this point, the GoPro is connected to the desired network in Station Mode! Quiz time! 📚 ✏️ True or False: When the GoPro is in Station Mode, it can be communicated with via both BLE and HTTP. A: True B: False Submit Answer Correct!! 😃 Incorrect!! 😭 The correct answer is B. When the GoPro is in station mode, it is connected via WiFi to another Access Point; not connected via Wifi to you (the client). However, it is possible to maintain the BLE connection in STA mode so that you can still control the GoPro. Troubleshooting See the first tutorial’s BLE troubleshooting section to troubleshoot BLE problems. Good Job! Congratulations 🤙 You have now connected the GoPro to a WiFi network in either AP or STA mode. To see how to make use of AP mode, continue to the next tutorial. 
To see how to make use of STA mode, continue to the camera on the home network tutorial.",
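As a reference for the fragmentation code that is split across the excerpt above, here is a self-contained sketch of the same approach. It assumes the 20-byte maximum BLE packet size and the single-byte 0x80 continuation header used in these tutorials, and only handles the 5-bit and 13-bit general headers:

```python
# Self-contained sketch of the fragmentation described above: a general header
# on the first packet (5-bit or 13-bit forms only), a continuation header
# (0x80, i.e. the continuation bit) on the rest, and 20-byte packets.
from typing import Generator

MAX_PACKET_SIZE = 20
CONTINUATION_HEADER = bytearray([0x80])


def yield_fragmented_packets(payload: bytes) -> Generator[bytes, None, None]:
    length = len(payload)
    if length < (2**5 - 1):
        header = bytearray([length])  # 5-bit general header
    elif length < (2**13 - 1):
        header = bytearray((length | 0x2000).to_bytes(2, "big"))  # 13-bit extended header
    else:
        raise ValueError("Payload too large for this sketch")

    byte_index = 0
    is_first_packet = True
    while bytes_remaining := length - byte_index:
        # First packet carries the general header; the rest carry the continuation header
        packet = bytearray(header) if is_first_packet else bytearray(CONTINUATION_HEADER)
        is_first_packet = False
        packet_size = min(MAX_PACKET_SIZE - len(packet), bytes_remaining)
        packet.extend(payload[byte_index : byte_index + packet_size])
        byte_index += packet_size
        yield bytes(packet)
```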
+ "categories": [],
+ "tags": [],
+ "url": "/OpenGoPro/tutorials/connect-wifi#"
+ },
+ {
+ "title": "Tutorial 7: Send WiFi Commands: ",
+ "excerpt": "This document will provide a walk-through tutorial to perform Open GoPro HTTP Operations with the GoPro. It is suggested that you have first completed the Connecting to Wifi tutorial. This tutorial only considers sending these commands as one-off commands. That is, it does not consider state management / synchronization when sending multiple commands. This will be discussed in a future tutorial. There are two types of responses that can be received from the HTTP commands: JSON and binary. This section will deal with commands that return JSON responses. For commands with binary responses (as well as commands with JSON responses that work with the media list), see the next tutorial. Requirements It is assumed that the hardware and software requirements from the connecting BLE tutorial are present and configured correctly. The scripts that will be used for this tutorial can be found in the Tutorial 7 Folder. Just Show me the Demo(s)!! python kotlin Each of the scripts for this tutorial can be found in the Tutorial 7 directory. Python >= 3.9 and < 3.12 must be used as specified in the requirements You must be connected to the camera via WiFi as stated in Tutorial 5. Get Camera State You can test querying the state of your camera with HTTP over WiFi using the following script: $ python wifi_command_get_state.py See the help for parameter definitions: $ python wifi_command_get_state.py --help usage: wifi_command_get_state.py [-h] Get the state of the GoPro (status and settings). optional arguments: -h, --help show this help message and exit Preview Stream You can test enabling the UDP preview stream with HTTP over WiFi using the following script: $ python wifi_command_preview_stream.py See the help for parameter definitions: $ python wifi_command_preview_stream.py --help usage: wifi_command_preview_stream.py [-h] Enable the preview stream. optional arguments: -h, --help show this help message and exit Once enabled the stream can be viewed at udp://@:8554 (For more details see the View Stream tab in the Preview Stream section below. Load Preset Group You can test sending the load preset group command with HTTP over WiFi using the following script: $ python wifi_command_load_group.py See the help for parameter definitions: $ python wifi_command_load_group.py --help usage: wifi_command_load_group.py [-h] Load the video preset group. optional arguments: -h, --help show this help message and exit Set Shutter You can test sending the Set Shutter command with HTTP over WiFi using the following script: $ python wifi_command_set_shutter.py See the help for parameter definitions: $ python wifi_command_set_shutter.py --help usage: wifi_command_set_shutter.py [-h] Take a 3 second video. optional arguments: -h, --help show this help message and exit Set Setting You can test setting the resolution setting with HTTP over WiFi using the following script: $ python wifi_command_set_resolution.py See the help for parameter definitions: $ python wifi_command_set_resolution.py --help usage: wifi_command_set_resolution.py [-h] Set the video resolution to 1080. optional arguments: -h, --help show this help message and exit The Kotlin file for this tutorial can be found on Github. To perform the tutorial, run the Android Studio project, select “Tutorial 7” from the dropdown and click on “Perform.” This requires: a GoPro is already connected via BLE, i.e. that Tutorial 1 was already run. a GoPro is already connected via Wifi, i.e. that Tutorial 5 was already run. 
You can check the BLE and Wifi statuses at the top of the app. Perform Tutorial 7 This will start the tutorial and log to the screen as it executes. When the tutorial is complete, click “Exit Tutorial” to return to the Tutorial selection screen. Setup We must first connect to The GoPro’s WiFi Access Point (AP) as was discussed in the Connecting to Wifi tutorial. Sending HTTP Commands with JSON Responses Now that we are are connected via WiFi, we can communicate via HTTP commands. python kotlin We will use the requests package to send the various HTTP commands. We are building the endpoints using the GOPRO_BASE_URL defined in the tutorial package’s __init__.py We are using ktor for the HTTP client. We are using an abstracted get function from our Wifi class to send get requests as such: private val client by lazy { HttpClient(CIO) { install(HttpTimeout) } } suspend fun get(endpoint: String, timeoutMs: Long = 5000L): JsonObject { Timber.d(\"GET request to: $endpoint\") val response = client.request(endpoint) { timeout { requestTimeoutMillis = timeoutMs } } val bodyAsString: String = response.body() return prettyJson.parseToJsonElement(bodyAsString).jsonObject } Both Command Requests and Setting Requests follow the same procedure: Send HTTP GET command to appropriate endpoint Receive confirmation from GoPro (via HTTP response) that request was received. GoPro reacts to command The HTTP response only indicates that the request was received correctly. The relevant behavior of the GoPro must be observed to verify when the command’s effects have been applied. GoProOpen GoPro user deviceGoProOpen GoPro user devicePC connected to WiFi APCommand Request (GET)Command Response (HTTP 200 OK)Apply affects of command when able Get Camera State The first command we will be sending is Get Camera State. This command will return all of the current settings and values. It is basically a combination of the Get All Settings and Get All Statuses commands that were sent via BLE. Since there is no way to query individual settings / statuses via WiFi (or register for asynchronous notifications when they change), this is the only option to query setting / status information via WiFi. The command writes to the following endpoint: /gopro/camera/state Let’s build the endpoint then perform the GET operation and check the response for errors. Any errors will raise an exception. python kotlin url = GOPRO_BASE_URL + \"/gopro/camera/state\" response = requests.get(url) response.raise_for_status() var response = wifi.get(GOPRO_BASE_URL + \"gopro/camera/state\") Lastly, we print the response’s JSON data: python kotlin logger.info(f\"Response: {json.dumps(response.json(), indent=4)}\") The response will log as such (abbreviated for brevity): INFO:root:Getting GoPro's status and settings: sending http://10.5.5.9:8080/gopro/camera/state INFO:root:Command sent successfully INFO:root:Response: { \"status\": { \"1\": 1, \"2\": 2, \"3\": 0, \"4\": 255, \"6\": 0, \"8\": 0, \"9\": 0, \"10\": 0, \"11\": 0, \"13\": 0, \"14\": 0, \"17\": 1, ... \"settings\": { \"2\": 9, \"3\": 1, \"5\": 0, \"6\": 1, \"13\": 1, \"19\": 0, \"24\": 0, \"30\": 0, \"31\": 0, \"32\": 10, \"41\": 9, \"42\": 5, Timber.i(prettyJson.encodeToString(response)) The response will log as such (abbreviated for brevity): Getting camera state GET request to: http://10.5.5.9:8080/gopro/camera/state { \"status\": { \"1\": 1, \"2\": 4, \"3\": 0, \"4\": 255, \"6\": 0, \"8\": 0, \"9\": 0, \"10\": 0, \"11\": 0, \"13\": 0, ... 
\"113\": 0, \"114\": 0, \"115\": 0, \"116\": 0, \"117\": 31154688 }, \"settings\": { \"2\": 9, \"3\": 1, \"5\": 0, \"6\": 1, \"13\": 1, ... \"177\": 0, \"178\": 1, \"179\": 3, \"180\": 0, \"181\": 0 } } We can see what each of these values mean by looking at relevant documentation in the settings or status object of the State schema. For example (for settings): ID 2 == 9 equates to Resolution == 1080 ID 3 == 1 equates to FPS == 120 Load Preset Group The next command we will be sending is Load Preset Group, which is used to toggle between the 3 groups of presets (video, photo, and timelapse). The preset groups ID’s are: Command Bytes Load Video Preset Group 1000 Load Photo Preset Group 1001 Load Timelapse Preset Group 1002 python kotlin url = GOPRO_BASE_URL + \"/gopro/camera/presets/set_group?id=1000\" response = requests.get(url) response.raise_for_status() response = wifi.get(GOPRO_BASE_URL + \"gopro/camera/presets/load?id=1000\") Lastly, we print the response’s JSON data: python kotlin logger.info(f\"Response: {json.dumps(response.json(), indent=4)}\") This will log as such: INFO:root:Loading the video preset group: sending http://10.5.5.9:8080/gopro/camera/presets/set_group?id=1000 INFO:root:Command sent successfully INFO:root:Response: {} Timber.i(prettyJson.encodeToString(response)) The response will log as such: Loading Video Preset Group GET request to: http://10.5.5.9:8080/gopro/camera/presets/load?id=1000 { } Lastly, we print the response’s JSON data: The response JSON is empty. This is expected in the case of a success. You should hear the camera beep and switch to the Cinematic Preset (assuming it wasn’t already set). You can verify this by seeing the preset name in the pill at bottom middle of the screen. Load Preset Set Shutter The next command we will be sending is Set Shutter. which is used to start and stop encoding. python kotlin url = GOPRO_BASE_URL + f\"/gopro/camera/shutter/start\" response = requests.get(url) response.raise_for_status() response = wifi.get(GOPRO_BASE_URL + \"gopro/camera/shutter/start\") Lastly, we print the response’s JSON data: This command does not return a JSON response so we don’t print the response This will log as such: python kotlin INFO:root:Turning the shutter on: sending http://10.5.5.9:8080/gopro/camera/shutter/start INFO:root:Command sent successfully Timber.i(prettyJson.encodeToString(response)) The response will log as such: Setting Shutter On GET request to: http://10.5.5.9:8080/gopro/camera/shutter/start { } We can then wait a few seconds and repeat the above procedure to set the shutter off using gopro/camera/shutter/stop. The shutter can not be set on if the camera is encoding or set off if the camera is not encoding. An attempt to do so will result in an error response. Set Setting The next command will be sending is Set Setting. This end point is used to update all of the settings on the camera. It is analogous to BLE commands like Set Video Resolution. It is important to note that many settings are dependent on the video resolution (and other settings). For example, certain FPS values are not valid with certain resolutions. In general, higher resolutions only allow lower FPS values. Check the camera capabilities to see which settings are valid for given use cases. Let’s build the endpoint first to set the Video Resolution to 1080 (the setting_id and option value comes from the command table linked above). 
python kotlin url = GOPRO_BASE_URL + f\"/gopro/camera/setting?setting=2&option=9\" response = requests.get(url) response.raise_for_status() response = wifi.get(GOPRO_BASE_URL + \"gopro/camera/setting?setting=2&option=9\") Lastly, we print the response’s JSON data: python kotlin logger.info(f\"Response: {json.dumps(response.json(), indent=4)}\") This will log as such: INFO:root:Setting the video resolution to 1080: sending http://10.5.5.9:8080/gopro/camera/setting?setting_id=2&opt_value=9 INFO:root:Command sent successfully INFO:root:Response: {} Timber.i(prettyJson.encodeToString(response)) The response will log as such: Setting Resolution to 1080 GET request to: http://10.5.5.9:8080/gopro/camera/setting?setting=2&option=9 { } The response JSON is empty. This is expected in the case of a success. You should hear the camera beep and see the video resolution change to 1080 in the pill in the bottom-middle of the screen: Video Resolution As a reader exercise, try using the Get Camera State command to verify that the resolution has changed. Preview Stream The next command we will be sending is Start Preview Stream. This command will enable (or disable) the preview stream. It is then possible to view the preview stream from a media player. The commands write to the following endpoints: Command Endpoint start preview stream /gopro/camera/stream/start stop preview stream /gopro/camera/stream/stop Let’s build the endpoint then send the GET request and check the response for errors. Any errors will raise an exception. python kotlin url = GOPRO_BASE_URL + \"/gopro/camera/stream/start\" response = requests.get(url) response.raise_for_status() TODO Lastly, we print the response’s JSON data: python kotlin logger.info(f\"Response: {json.dumps(response.json(), indent=4)}\") This will log as such: INFO:root:Starting the preview stream: sending http://10.5.5.9:8080/gopro/camera/stream/start INFO:root:Command sent successfully INFO:root:Response: {} TODO The response JSON is empty. This is expected in the case of a success. Once enabled, the stream can be viewed at udp://@:8554. Here is an example of viewing this using VLC: The screen may slightly vary depending on your OS Select Media–>Open Network Stream Enter the path as such: Configure Preview Stream Select play The preview stream should now be visible. Quiz time! 📚 ✏️ What is the significance of empty JSON in an HTTP response? A: Always an error! The command was not received correctly. B: If the status is ok (200), this is expected. C: This is expected for errors (code other than 200) but not expected for ok (200). Submit Answer Correct!! 😃 Incorrect!! 😭 The correct answer is B. It is common for the JSON response to be empty if the command was received successfully but there is no additional information to return at the current time. Which of the following is not a real preset group? A: Timelapse B: Photo C: Burst D: Video Submit Answer Correct!! 😃 Incorrect!! 😭 The correct answer is C. There are 3 preset groups (Timelapse, Photo, and Video). These can be set via the Load Preset Group command. How do you query the current video resolution setting (id = 2) via WiFi? A: Send GET to /gopro/camera/state?setting_id=2 B: Send GET to /gopro/camera/state?get_setting=2 C: Send POST to /gopro/camera/state with request ‘setting_id=2’ D: None of the Above Submit Answer Correct!! 😃 Incorrect!! 😭 The correct answer is D. You can’t query individual settings or statuses with the HTTP API. 
In order to get the value of a specific setting you’ll need to send a GET to /gopro/camera/state and parse the value of the desired setting from the JSON response. Troubleshooting HTTP Logging Wireshark can be used to view the HTTP commands and responses between the PC and the GoPro. Start a Wireshark capture on the WiFi adapter that is used to connect to the GoPro. Filter for the GoPro IP address (10.5.5.9). Wireshark Good Job! Congratulations 🤙 You can now send any of the HTTP commands defined in the Open GoPro Interface that return JSON responses. You may have noted that we did not discuss one of these (Get Media List) in this tutorial. Proceed to the next tutorial to see how to get and perform operations using the media list.",
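To tie the last quiz answer together, here is a small sketch (assuming the requests library and the tutorial's GOPRO_BASE_URL of http://10.5.5.9:8080) that reads one setting value out of the full camera state:

```python
# Sketch: since individual settings cannot be queried over HTTP, fetch the full
# state and index into the "settings" object.
import requests

GOPRO_BASE_URL = "http://10.5.5.9:8080"  # assumption: matches the tutorial package


def get_setting_value(setting_id: int) -> int:
    response = requests.get(GOPRO_BASE_URL + "/gopro/camera/state", timeout=10)
    response.raise_for_status()
    # Setting IDs are JSON keys, so they come back as strings
    return response.json()["settings"][str(setting_id)]


if __name__ == "__main__":
    # Setting 2 is video resolution; e.g. option 9 equates to 1080 per the settings docs
    print(f"Current resolution option: {get_setting_value(2)}")
```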
+ "categories": [],
+ "tags": [],
+ "url": "/OpenGoPro/tutorials/send-wifi-commands#"
+ },
+ {
+ "title": "Tutorial 8: Camera Media List: ",
+ "excerpt": "This document will provide a walk-through tutorial to send Open GoPro HTTP commands to the GoPro, specifically to get the media list and perform operations on it (downloading pictures, videos, etc.) It is suggested that you have first completed the Connecting to Wifi and Sending WiFi Commands tutorials. This tutorial only considers sending these commands as one-off commands. That is, it does not consider state management / synchronization when sending multiple commands. This will be discussed in a future tutorial. Requirements It is assumed that the hardware and software requirements from the connecting BLE tutorial are present and configured correctly. The scripts that will be used for this tutorial can be found in the Tutorial 8 Folder. Just Show me the Demo(s)!! python kotlin Each of the scripts for this tutorial can be found in the Tutorial 8 directory. Python >= 3.9 and < 3.12 must be used as specified in the requirements You must be connected to the camera via WiFi in order to run these scripts. You can do this by manually to the SSID and password listed on your camera or by leaving the Establish Connection to WiFi AP script from Tutorial 5 running in the background. Download Media File You can downloading a file from your camera with HTTP over WiFi using the following script: $ python wifi_media_download_file.py See the help for parameter definitions: $ python wifi_media_download_file.py --help usage: wifi_media_download_file.py [-h] Find a photo on the camera and download it to the computer. optional arguments: -h, --help show this help message and exit Get Media Thumbnail You can downloading the thumbnail for a media file from your camera with HTTP over WiFi using the following script: $ python wifi_media_get_thumbnail.py See the help for parameter definitions: $ python wifi_media_get_thumbnail.py --help usage: wifi_media_get_thumbnail.py [-h] Get the thumbnail for a media file. optional arguments: -h, --help show this help message and exit The Kotlin file for this tutorial can be found on Github. To perform the tutorial, run the Android Studio project, select “Tutorial 8” from the dropdown and click on “Perform.” This requires: a GoPro is already connected via BLE, i.e. that Tutorial 1 was already run. a GoPro is already connected via Wifi, i.e. that Tutorial 5 was already run. You can check the BLE and Wifi statuses at the top of the app. Perform Tutorial 8 This will start the tutorial and log to the screen as it executes. When the tutorial is complete, click “Exit Tutorial” to return to the Tutorial selection screen. Setup We must first connect to The GoPro’s WiFi Access Point (AP) as was discussed in the Connecting to Wifi tutorial. Get Media List Now that we are are connected via WiFi, we will get the media list using the same procedure to send HTTP commands as in the previous tutorial. We get the media list via Get Media List. This will return a JSON structure of all of the media files (pictures, videos) on the camera with corresponding information about each media file. Let’s build the endpoint, send the GET request, and check the response for errors. Any errors will raise an exception. 
python kotlin url = GOPRO_BASE_URL + \"/gopro/media/list\" response = requests.get(url) response.raise_for_status() val response = wifi.get(GOPRO_BASE_URL + \"gopro/media/list\") Lastly, we print the response’s JSON data: python kotlin logger.info(f\"Response: {json.dumps(response.json(), indent=4)}\") The response will log as such (abbreviated for brevity): INFO:root:Getting the media list: sending http://10.5.5.9:8080/gopro/media/list INFO:root:Command sent successfully INFO:root:Response: { \"id\": \"2510746051348624995\", \"media\": [ { \"d\": \"100GOPRO\", \"fs\": [ { \"n\": \"GOPR0987.JPG\", \"cre\": \"1618583762\", \"mod\": \"1618583762\", \"s\": \"5013927\" }, { \"n\": \"GOPR0988.JPG\", \"cre\": \"1618583764\", \"mod\": \"1618583764\", \"s\": \"5009491\" }, { \"n\": \"GOPR0989.JPG\", \"cre\": \"1618583766\", \"mod\": \"1618583766\", \"s\": \"5031861\" }, { \"n\": \"GX010990.MP4\", \"cre\": \"1451608343\", \"mod\": \"1451608343\", \"glrv\": \"806586\", \"ls\": \"-1\", \"s\": \"10725219\" }, Timber.i(\"Files in media list: ${prettyJson.encodeToString(fileList)}\") The response will log as such (abbreviated for brevity): GET request to: http://10.5.5.9:8080/gopro/media/list Complete media list: { \"id\": \"4386457835676877283\", \"media\": [ { \"d\": \"100GOPRO\", \"fs\": [ { \"n\": \"GOPR0232.JPG\", \"cre\": \"1748997965\", \"mod\": \"1748997965\", \"s\": \"7618898\" }, { \"n\": \"GOPR0233.JPG\", \"cre\": \"1748998273\", \"mod\": \"1748998273\", \"s\": \"7653472\" }, ... { \"n\": \"GX010259.MP4\", \"cre\": \"1677828860\", \"mod\": \"1677828860\", \"glrv\": \"943295\", \"ls\": \"-1\", \"s\": \"9788009\" } ] } ] } The media list format is defined in the Media Model. We won’t be rehashing that here but will provide examples below of using the media list. One common functionality is to get the list of media file names, which can be done as such: python kotlin print([x[\"n\"] for x in media_list[\"media\"][0][\"fs\"]]) That is, access the list at the fs tag at the first element of the media tag, then make a list of all of the names (n tag of each element) in the fs list. val fileList = response[\"media\"]?.jsonArray?.first()?.jsonObject?.get(\"fs\")?.jsonArray?.map { mediaEntry -> mediaEntry.jsonObject[\"n\"] }?.map { it.toString().replace(\"\\\"\", \"\") } That is: Access the JSON array at the fs tag at the first element of the media tag Make a list of all of the names (n tag of each element) in the fs list. Map this list to string and remove backslashes Media List Operations Whereas all of the WiFi commands described until now have returned JSON responses, most of the media list operations return binary data. From an HTTP perspective, the behavior is the same. However, the GET response will contain a large binary chunk of information so we will loop through it with the requests library as such, writing up to 8 kB at a time: GoProOpen GoPro user devicediskGoProOpen GoPro user devicediskPC connected to WiFi APloop[write until complete]Get Media List (GET)Media List (HTTP 200 OK)Command Request (GET)Binary Response (HTTP 200 OK)write <= 8K Download Media File TODO Handle directory in media list. The next command we will be sending is Download Media. Specifically, we will be downloading a photo. The camera must have at least one photo in its media list in order for this to work. First, we get the media list as in Get Media List . Then we search through the list of file names in the media list looking for a photo (i.e. a file whose name ends in .jpg). 
Once we find a photo, we proceed: python kotlin media_list = get_media_list() photo: Optional[str] = None for media_file in [x[\"n\"] for x in media_list[\"media\"][0][\"fs\"]]: if media_file.lower().endswith(\".jpg\"): logger.info(f\"found a photo: {media_file}\") photo = media_file break val photo = fileList?.firstOrNull { it.endsWith(ignoreCase = true, suffix = \"jpg\") } ?: throw Exception(\"Not able to find a .jpg in the media list\") Timber.i(\"Found a photo: $photo\") Now let’s build the endpoint, send the GET request, and check the response for errors. Any errors will raise an exception. The endpoint will start with “videos” for both photos and videos python kotlin url = GOPRO_BASE_URL + f\"videos/DCIM/100GOPRO/{photo}\" with requests.get(url, stream=True) as request: request.raise_for_status() Lastly, we iterate through the binary content in 8 kB chunks, writing to a local file: file = photo.split(\".\")[0] + \".jpg\" with open(file, \"wb\") as f: logger.info(f\"receiving binary stream to {file}...\") for chunk in request.iter_content(chunk_size=8192): f.write(chunk) return wifi.getFile( GOPRO_BASE_URL + \"videos/DCIM/100GOPRO/$photo\", appContainer.applicationContext ) TODO FIX THIS This will log as such: python kotlin INFO:root:found a photo: GOPR0987.JPG INFO:root:Downloading GOPR0987.JPG INFO:root:Sending: http://10.5.5.9:8080/videos/DCIM/100GOPRO/GOPR0987.JPG INFO:root:receiving binary stream to GOPR0987.jpg... Once complete, the GOPR0987_thumbnail.jpg file will be available from where the demo script was called. Found a photo: GOPR0232.JPG Downloading photo: GOPR0232.JPG... Once complete, the photo will display in the tutorial window. Get Media Thumbnail The next command we will be sending is Get Media thumbnail . Specifically, we will be getting the thumbnail for a photo. The camera must have at least one photo in its media list in order for this to work. There is a separate commandto get a media “screennail” First, we get the media list as in Get Media List . Then we search through the list of file names in the media list looking for a photo (i.e. a file whose name ends in .jpg). Once we find a photo, we proceed: python kotlin media_list = get_media_list() photo: Optional[str] = None for media_file in [x[\"n\"] for x in media_list[\"media\"][0][\"fs\"]]: if media_file.lower().endswith(\".jpg\"): logger.info(f\"found a photo: {media_file}\") photo = media_file break TODO Now let’s build the endpoint, send the GET request, and check the response for errors. Any errors will raise an exception. python kotlin url = GOPRO_BASE_URL + f\"/gopro/media/thumbnail?path=100GOPRO/{photo}\" with requests.get(url, stream=True) as request: request.raise_for_status() Lastly, we iterate through the binary content in 8 kB chunks, writing to a local file: file = photo.split(\".\")[0] + \".jpg\" with open(file, \"wb\") as f: logger.info(f\"receiving binary stream to {file}...\") for chunk in request.iter_content(chunk_size=8192): f.write(chunk) TODO This will log as such: python kotlin INFO:root:found a photo: GOPR0987.JPG INFO:root:Getting the thumbnail for GOPR0987.JPG INFO:root:Sending: http://10.5.5.9:8080/gopro/media/thumbnail?path=100GOPRO/GOPR0987.JPG INFO:root:receiving binary stream to GOPR0987_thumbnail.jpg... TODO Troubleshooting See the previous tutorial’s troubleshooting section. Good Job! Congratulations 🤙 You can now query the GoPro’s media list and retrieve binary information for media file. This is currently last tutorial. 
Stay tuned for more 👍 At this point you should be able to start creating a useful example using the Open GoPro Interface. For some inspiration check out some of the demos.",
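As a compact reference for the media workflow above, here is a sketch that finds the first photo in the media list and streams it to disk in 8 kB chunks. It assumes the requests library and the tutorial's GOPRO_BASE_URL; error handling is minimal:

```python
# Sketch: find the first photo in the media list and stream it to disk in
# 8 kB chunks, mirroring the tutorial's download flow.
import requests

GOPRO_BASE_URL = "http://10.5.5.9:8080"  # assumption: matches the tutorial package


def download_first_photo() -> str:
    media_list = requests.get(GOPRO_BASE_URL + "/gopro/media/list", timeout=10).json()
    directory = media_list["media"][0]["d"]
    # Raises StopIteration if there is no photo on the camera
    photo = next(
        f["n"] for f in media_list["media"][0]["fs"] if f["n"].lower().endswith(".jpg")
    )

    # Note that the download endpoint starts with "videos" for both photos and videos
    url = GOPRO_BASE_URL + f"/videos/DCIM/{directory}/{photo}"
    with requests.get(url, stream=True, timeout=10) as request:
        request.raise_for_status()
        with open(photo, "wb") as f:
            for chunk in request.iter_content(chunk_size=8192):
                f.write(chunk)
    return photo
```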
+ "categories": [],
+ "tags": [],
+ "url": "/OpenGoPro/tutorials/camera-media-list#"
+ },
+ {
+ "title": "Tutorial 9: Camera on the Home Network: ",
+ "excerpt": "This document will provide a walk-through tutorial to use the Open GoPro Interface to configure and demonstrate the Camera on the Home Network (COHN) feature. It is recommended that you have first completed the connecting BLE, sending commands, parsing responses, protobuf, and connecting WiFi tutorials before proceeding. Requirements It is assumed that the hardware and software requirements from the connecting BLE tutorial are present and configured correctly. The scripts that will be used for this tutorial can be found in the Tutorial 9 Folder. Just Show me the Demo(s)!! python kotlin Each of the scripts for this tutorial can be found in the Tutorial 9 directory. Python >= 3.9 and < 3.12 must be used as specified in the requirements Provision COHN You can provision the GoPro for COHN to communicate via a network via: $ python provision_cohn.py See the help for parameter definitions: $ python provision_cohn.py --help usage: provision_cohn.py [-h] [-i IDENTIFIER] [-c CERTIFICATE] ssid password Provision COHN via BLE to be ready for communication. positional arguments: ssid SSID of network to connect to password Password of network to connect to options: -h, --help show this help message and exit -i IDENTIFIER, --identifier IDENTIFIER Last 4 digits of GoPro serial number, which is the last 4 digits of the default camera SSID. If not used, first discovered GoPro will be connected to -c CERTIFICATE, --certificate CERTIFICATE Path to write retrieved COHN certificate. Communicate via COHN You can see an example of communicating HTTPS via COHN (assuming it has already been provisioned) via: $ python communicate_via_cohn.py See the help for parameter definitions: $ python communicate_via_cohn.py --help usage: communicate_via_cohn.py [-h] ip_address username password certificate Demonstrate HTTPS communication via COHN. positional arguments: ip_address IP Address of camera on the home network username COHN username password COHN password certificate Path to read COHN cert from. options: -h, --help show this help message and exit TODO Setup We must first connect to BLE as was discussed in the connecting BLE tutorial. The GoPro must then be connected to an access point as was discussed in the Connecting WiFi Tutorial. For all of the BLE operations, we are using the same ResponseManager class that was defined in the Protobuf tutorial. COHN Overview The Camera on the Home Network feature allows the GoPro to connect (in Station Mode) to an Access Point (AP) such as a router in order to be controlled over a local network via the HTTP API. In order to protect users who connect to a network that includes Bad Actors, COHN uses SSL/TLS so that command and responses are sent securely encrypted via https:// rather than http://. Once COHN is provisioned it is possible to control the GoPro without a BLE connection by communicating via HTTPS over the provisioned network. Provisioning In order to use the COHN capability, the GoPro must first be provisioned for COHN via BLE. At a high level, the provisioning process is as follows: Connect the GoPro to an access point Instruct the GoPro to create a COHN Certificate Get the created COHN certificate Get the COHN status to retrieve and store COHN credentials for future use A summary of this process is shown here and will be expanded upon in the following sections: Provision COHN Set Date Time While not explicitly part of of the provisioning process, it is important that the GoPro’s date and time are correct so that it generates a valid SSL certificate. 
This can be done manually through the camera UI or programmatically using the Set Local Datetime command. For the provisioning demo discussed in this tutorial, this is done programmatically: python kotlin The code shown here can be found in provision_cohn.py We’re using the pytz and tzlocal libraries to find the timezone offset and daylight savings time status. In the set_date_time method, we send the request and wait to receive the successful response: datetime_request = bytearray( [ 0x0F, Command ID 10, Length of following datetime parameter *now.year.to_bytes(2, \"big\", signed=False), uint16 year now.month, now.day, now.hour, now.minute, now.second, *offset.to_bytes(2, \"big\", signed=True), int16 offset in minutes is_dst, ] ) datetime_request.insert(0, len(datetime_request)) await manager.client.write_gatt_char(GoProUuid.COMMAND_REQ_UUID.value, datetime_request, response=True) response = await manager.get_next_response_as_tlv() which logs as: Setting the camera's date and time to 2024-04-04 13:00:05.097305-07:00:-420 is_dst=True Writing: 0c:0f:0a:07:e8:04:04:0d:00:05:fe:5c:01 Received response at GoProUuid.COMMAND_RSP_UUID: 02:0f:00 Successfully set the date time. TODO Create the COHN Certificate Now that the GoPro’s date and time are valid and it has been connected to an Access Point, we can continue to provision COHN. Let’s instruct the GoPro to Create a COHN certificate. python kotlin create_request = bytearray( [ 0xF1, Feature ID 0x67, Action ID *proto.RequestCreateCOHNCert().SerializePartialToString(), ] ) create_request.insert(0, len(create_request)) await manager.client.write_gatt_char(GoProUuid.COMMAND_REQ_UUID.value, create_request, response=True) response := await manager.get_next_response_as_protobuf() which logs as: Creating a new COHN certificate. Writing: 02:f1:67 Received response at GoProUuid.COMMAND_RSP_UUID: 04:f1:e7:08:01 COHN certificate successfully created TODO You may notice that the provisioning demo first Clears the COHN Certificate. This is only to ensure a consistent starting state in the case that COHN has already been provisioned. It is not necessary to clear the certificate if COHN has not yet been provisioned. Get the COHN Credentials At this point the GoPro has created the certificate and is in the process of provisioning COHN. We now need to get the COHN credentials that will be used for HTTPS communication. These are: COHN certificate Basic auth username Basic auth password IP Address of COHN network We can immediately get the COHN certificate as such: python kotlin cert_request = bytearray( [ 0xF5, Feature ID 0x6E, Action ID *proto.RequestCOHNCert().SerializePartialToString(), ] ) cert_request.insert(0, len(cert_request)) await manager.client.write_gatt_char(GoProUuid.QUERY_REQ_UUID.value, cert_request, response=True) response := await manager.get_next_response_as_protobuf(): cert_response: proto.ResponseCOHNCert = response.data type: ignore return cert_response.cert TODO For the remaining credentials, we need to wait until the COHN network is connected. That is, we need to Get COHN Status until we receive a status where the state is set to COHN_STATE_NetworkConnected. This final status contains the remaining credentials: username, password, and IP Address. 
To do this, we first register to receive asynchronous COHN status updates: python kotlin status_request = bytearray( [ 0xF5, Feature ID 0x6F, Action ID *proto.RequestGetCOHNStatus(register_cohn_status=True).SerializePartialToString(), ] ) status_request.insert(0, len(status_request)) await manager.client.write_gatt_char(GoProUuid.QUERY_REQ_UUID.value, status_request, response=True) TODO Then we continuously receive and check the updates until we receive the desired status: python kotlin while response := await manager.get_next_response_as_protobuf(): cohn_status: proto.NotifyCOHNStatus = response.data type: ignore if cohn_status.state == proto.EnumCOHNNetworkState.COHN_STATE_NetworkConnected: return cohn_status This will all display in the log as such: Checking COHN status until provisioning is complete Writing: 04:f5:6f:08:01 ... Received response at GoProUuid.QUERY_RSP_UUID: 20:47:f5:ef:08:01:10:1b:1a:05:67:6f:70:72:6f:22:0c:47:7a:74 Received response at GoProUuid.QUERY_RSP_UUID: 80:32:6d:36:59:4d:76:4c:41:6f:2a:0e:31:39:32:2e:31:36:38:2e Received response at GoProUuid.QUERY_RSP_UUID: 81:35:30:2e:31:30:33:30:01:3a:0a:64:61:62:75:67:64:61:62:75 Received response at GoProUuid.QUERY_RSP_UUID: 82:67:42:0c:32:34:37:34:66:37:66:36:36:31:30:34 Received COHN Status: status: COHN_PROVISIONED state: COHN_STATE_NetworkConnected username: \"gopro\" password: \"Gzt2m6YMvLAo\" ipaddress: \"192.168.50.103\" enabled: true ssid: \"dabugdabug\" macaddress: \"2474f7f66104\" Successfully provisioned COHN. TODO Finally we accumulate all of the credentials and log them, also storing the certificate to a cohn.crt file: python kotlin credentials = await provision_cohn(manager) with open(certificate, \"w\") as fp: fp.write(credentials.certificate) logger.info(f\"Certificate written to {certificate.resolve()}\") { \"certificate\": \"-----BEGIN CERTIFICATE-----\\nMIIDnzCCAoegAwIBAgIUC7DGLtJJ61TzRY/mYQyhOegnz6cwDQYJKoZIhvcNAQ EL\\nBQAwaTELMAkGA1UEBhMCVVMxCzAJBgNVBAgMAkNBMRIwEAYDVQQHDAlTYW4gTWF0\\nZW8xDjAMBg NVBAoMBUdvUHJvMQ0wCwYDVQQLDARIZXJvMRowGAYDVQQDDBFHb1By\\nbyBDYW1lcmEgUm9vdDAeFw0y NDA0MDQyMDAwMTJaFw0zNDA0MDIyMDAwMTJaMGkx\\nCzAJBgNVBAYTAlVTMQswCQYDVQQIDAJDQTESMB AGA1UEBwwJU2FuIE1hdGVvMQ4w\\nDAYDVQQKDAVHb1BybzENMAsGA1UECwwESGVybzEaMBgGA1UEAwwR R29Qcm8gQ2Ft\\nZXJhIFJvb3QwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQC05o1QIN5r\\n PmtTntzpzBQvfq64OM1j/tjdNCJsyB9/ipPrPcKdItOy+5gZZF8iOFiw8cG8O2nA\\nvLSIJkpQ6d3cuE 48nAQpc1+jJzskM7Vgqc/i43OqnB8iTKjtNJgj+lJtreQBNJw7\\nf00a0GbbUJMo6DhaW58ZIsOJKu3i +w8w+LNEZECfDN6RMSmkYoLXaHeKAlvhlRYv\\nxkNO7pB2OwhbD9awgzKVTiKvZ8Hrxl6lGlH5SHHimU uo2O1yiNKDWv+MhirCVnup\\nVvP/N5S+230KpXreEnHmo65fsHmdM11qYu8WJXGzOViCnQi24wgCuoMx np9hAeKs\\nVj4vxhyCu8gZAgMBAAGjPzA9MA8GA1UdEwQIMAYBAf8CAQAwCwYDVR0PBAQDAgGG\\nMB0G A1UdDgQWBBTYDT4QXVDsi23ukLr2ohJk5+8+gDANBgkqhkiG9w0BAQsFAAOC\\nAQEAU4Z9120CGtRGo3 QfWEy66BGdqI6ohdudmb/3qag0viXag2FyWar18lRFiEWc\\nZcsqw6i0CM6lKNVUluEsSBiGGVAbAHKu +fcpId5NLEI7G1XY5MFRHMIMi4PNKbJr\\nVi0ks/biMy7u9++FOBgmCXGAdbMJBfe2gxEJNdyU6wjgGs 2o402/parrWN8x9J+k\\ndBgYqiKpZK0Fad/qM4ivbgkMijXhGFODhWs/GlQWnPeaLusRnn3T/w2CsFzM kf0i\\n6fFT3FAQBU5LCZs1Fp/XFRrnFMp+sNhbmdfnI9EDyZOXzlRS4O48k/AW/nSkCozk\\nugYW+61H /RYPVEgF4VNxRqn+uA==\\n-----END CERTIFICATE-----\\n\", \"username\": \"gopro\", \"password\": \"Gzt2m6YMvLAo\", \"ip_address\": \"192.168.50.103\" } Certificate written to C:\\Users\\user\\gopro\\OpenGoPro\\demos\\python\\tutorial\\tutorial_modules\\tutorial_9_cohn\\cohn.crt TODO Make sure to keep these credentials for use in the next section. 
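Since these credentials are needed by the follow-on HTTPS demo, a simple way to keep them is to write everything to disk. The sketch below is illustrative only: the file names and the credentials object's attribute names (certificate, username, password, ip_address, matching the JSON logged above) are assumptions, not part of the tutorial code:

```python
# Illustrative sketch: persist the COHN credentials gathered above so a later
# script can load them without re-provisioning. Attribute and file names are
# assumptions based on the JSON logged above.
import json
from pathlib import Path


def save_cohn_credentials(credentials, directory: Path = Path(".")) -> None:
    # PEM certificate used later as the CA bundle for HTTPS verification
    (directory / "cohn.crt").write_text(credentials.certificate)
    (directory / "cohn_credentials.json").write_text(
        json.dumps(
            {
                "username": credentials.username,
                "password": credentials.password,
                "ip_address": credentials.ip_address,
            },
            indent=4,
        )
    )
```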
Communicating via COHN Once the GoPro has provisioned for COHN, we can use the stored credentials for HTTPS communication. For the setup of this demo, there is no pre-existing BLE or WiFi connection to the GoPro. We are only going to be using HTTPS over the provisioned home network for communication. In order to demonstrate COHN communication we are going to Get the Camera State. python kotlin The code shown below is taken from communicate_via_cohn.py. The credentials logged and stored from the previous demo must be passed in as command line arguments to this script. Run python communicate_via_cohn.py --help for usage. We’re going to use the requests library to perform the HTTPS request. First let’s build the url using the ip_address CLI argument: url = f\"https://{ip_address}\" + \"/gopro/camera/state\" Then let’s build the basic auth token from the username and password CLI arguments: token = b64encode(f\"{username}:{password}\".encode(\"utf-8\")).decode(\"ascii\") Lastly we build and send the request using the above endpoint and token combined with the path to the certificate from the CLI certificate argument: response = requests.get( url, timeout=10, headers={\"Authorization\": f\"Basic {token}\"}, verify=str(certificate), ) logger.info(f\"Response: {json.dumps(response.json(), indent=4)}\") TODO This should result in logging the complete cameras state, truncated here for brevity: Sending: https://192.168.50.103/gopro/camera/state Command sent successfully Response: { \"status\": { \"1\": 1, \"2\": 4, \"3\": 0, \"4\": 255, \"6\": 0, \"8\": 0, \"9\": 0, ... \"settings\": { \"2\": 1, \"3\": 0, \"5\": 0, \"6\": 0, \"13\": 0, ... See the sending Wifi commands tutorial for more information on this and other HTTP(S) functionality. Quiz time! 📚 ✏️ Troubleshooting See the first tutorial’s troubleshooting section to troubleshoot any BLE problems. See the Sending Wifi Command tutorial’s troubleshooting section to troubleshoot HTTP communication. Good Job! Congratulations 🤙 You have now provisioned COHN and performed an HTTPS operation. In the future, you can now communicate with the GoPro over your home network without needing a direct BLE or WiFi connection.",
+ "categories": [],
+ "tags": [],
+ "url": "/OpenGoPro/tutorials/cohn#"
+ },]
\ No newline at end of file
diff --git a/assets/js/lunr/lunr.js b/assets/js/lunr/lunr.js
new file mode 100644
index 00000000..6aa370fb
--- /dev/null
+++ b/assets/js/lunr/lunr.js
@@ -0,0 +1,3475 @@
+/**
+ * lunr - http://lunrjs.com - A bit like Solr, but much smaller and not as bright - 2.3.9
+ * Copyright (C) 2020 Oliver Nightingale
+ * @license MIT
+ */
+
+;(function(){
+
+/**
+ * A convenience function for configuring and constructing
+ * a new lunr Index.
+ *
+ * A lunr.Builder instance is created and the pipeline setup
+ * with a trimmer, stop word filter and stemmer.
+ *
+ * This builder object is yielded to the configuration function
+ * that is passed as a parameter, allowing the list of fields
+ * and other builder parameters to be customised.
+ *
+ * All documents _must_ be added within the passed config function.
+ *
+ * @example
+ * var idx = lunr(function () {
+ * this.field('title')
+ * this.field('body')
+ * this.ref('id')
+ *
+ * documents.forEach(function (doc) {
+ * this.add(doc)
+ * }, this)
+ * })
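+ *
+ * @example <caption>Illustrative sketch (not part of the upstream docs): searching the index built above</caption>
+ * // assumes the `idx` and `documents` variables from the example above;
+ * // results come back sorted by score, most relevant first
+ * var results = idx.search('hello')
+ * results.forEach(function (result) {
+ * console.log(result.ref, result.score)
+ * })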
+ *
+ * @see {@link lunr.Builder}
+ * @see {@link lunr.Pipeline}
+ * @see {@link lunr.trimmer}
+ * @see {@link lunr.stopWordFilter}
+ * @see {@link lunr.stemmer}
+ * @namespace {function} lunr
+ */
+var lunr = function (config) {
+ var builder = new lunr.Builder
+
+ builder.pipeline.add(
+ lunr.trimmer,
+ lunr.stopWordFilter,
+ lunr.stemmer
+ )
+
+ builder.searchPipeline.add(
+ lunr.stemmer
+ )
+
+ config.call(builder, builder)
+ return builder.build()
+}
+
+lunr.version = "2.3.9"
+/*!
+ * lunr.utils
+ * Copyright (C) 2020 Oliver Nightingale
+ */
+
+/**
+ * A namespace containing utils for the rest of the lunr library
+ * @namespace lunr.utils
+ */
+lunr.utils = {}
+
+/**
+ * Print a warning message to the console.
+ *
+ * @param {String} message The message to be printed.
+ * @memberOf lunr.utils
+ * @function
+ */
+lunr.utils.warn = (function (global) {
+ /* eslint-disable no-console */
+ return function (message) {
+ if (global.console && console.warn) {
+ console.warn(message)
+ }
+ }
+ /* eslint-enable no-console */
+})(this)
+
+/**
+ * Convert an object to a string.
+ *
+ * In the case of `null` and `undefined` the function returns
+ * the empty string, in all other cases the result of calling
+ * `toString` on the passed object is returned.
+ *
+ * @param {Any} obj The object to convert to a string.
+ * @return {String} string representation of the passed object.
+ * @memberOf lunr.utils
+ */
+lunr.utils.asString = function (obj) {
+ if (obj === void 0 || obj === null) {
+ return ""
+ } else {
+ return obj.toString()
+ }
+}
+
+/**
+ * Clones an object.
+ *
+ * Will create a copy of an existing object such that any mutations
+ * on the copy cannot affect the original.
+ *
+ * Only shallow objects are supported, passing a nested object to this
+ * function will cause a TypeError.
+ *
+ * Objects with primitives, and arrays of primitives are supported.
+ *
+ * @param {Object} obj The object to clone.
+ * @return {Object} a clone of the passed object.
+ * @throws {TypeError} when a nested object is passed.
+ * @memberOf lunr.utils
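+ * @example <caption>Sketch, not from the upstream docs</caption>
+ * // shallow objects of primitives and arrays of primitives are fine
+ * var copy = lunr.utils.clone({ title: 'hello', tags: ['a', 'b'] })
+ * // a nested object would throw a TypeError:
+ * // lunr.utils.clone({ meta: { nested: true } })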
+ */
+lunr.utils.clone = function (obj) {
+ if (obj === null || obj === undefined) {
+ return obj
+ }
+
+ var clone = Object.create(null),
+ keys = Object.keys(obj)
+
+ for (var i = 0; i < keys.length; i++) {
+ var key = keys[i],
+ val = obj[key]
+
+ if (Array.isArray(val)) {
+ clone[key] = val.slice()
+ continue
+ }
+
+ if (typeof val === 'string' ||
+ typeof val === 'number' ||
+ typeof val === 'boolean') {
+ clone[key] = val
+ continue
+ }
+
+ throw new TypeError("clone is not deep and does not support nested objects")
+ }
+
+ return clone
+}
+lunr.FieldRef = function (docRef, fieldName, stringValue) {
+ this.docRef = docRef
+ this.fieldName = fieldName
+ this._stringValue = stringValue
+}
+
+lunr.FieldRef.joiner = "/"
+
+lunr.FieldRef.fromString = function (s) {
+ var n = s.indexOf(lunr.FieldRef.joiner)
+
+ if (n === -1) {
+ throw "malformed field ref string"
+ }
+
+ var fieldRef = s.slice(0, n),
+ docRef = s.slice(n + 1)
+
+ return new lunr.FieldRef (docRef, fieldRef, s)
+}
+
+lunr.FieldRef.prototype.toString = function () {
+ if (this._stringValue == undefined) {
+ this._stringValue = this.fieldName + lunr.FieldRef.joiner + this.docRef
+ }
+
+ return this._stringValue
+}
+/*!
+ * lunr.Set
+ * Copyright (C) 2020 Oliver Nightingale
+ */
+
+/**
+ * A lunr set.
+ *
+ * @constructor
+ */
+lunr.Set = function (elements) {
+ this.elements = Object.create(null)
+
+ if (elements) {
+ this.length = elements.length
+
+ for (var i = 0; i < this.length; i++) {
+ this.elements[elements[i]] = true
+ }
+ } else {
+ this.length = 0
+ }
+}
+
+/**
+ * A complete set that contains all elements.
+ *
+ * @static
+ * @readonly
+ * @type {lunr.Set}
+ */
+lunr.Set.complete = {
+ intersect: function (other) {
+ return other
+ },
+
+ union: function () {
+ return this
+ },
+
+ contains: function () {
+ return true
+ }
+}
+
+/**
+ * An empty set that contains no elements.
+ *
+ * @static
+ * @readonly
+ * @type {lunr.Set}
+ */
+lunr.Set.empty = {
+ intersect: function () {
+ return this
+ },
+
+ union: function (other) {
+ return other
+ },
+
+ contains: function () {
+ return false
+ }
+}
+
+/**
+ * Returns true if this set contains the specified object.
+ *
+ * @param {object} object - Object whose presence in this set is to be tested.
+ * @returns {boolean} - True if this set contains the specified object.
+ */
+lunr.Set.prototype.contains = function (object) {
+ return !!this.elements[object]
+}
+
+/**
+ * Returns a new set containing only the elements that are present in both
+ * this set and the specified set.
+ *
+ * @param {lunr.Set} other - set to intersect with this set.
+ * @returns {lunr.Set} a new set that is the intersection of this and the specified set.
+ */
+
+lunr.Set.prototype.intersect = function (other) {
+ var a, b, elements, intersection = []
+
+ if (other === lunr.Set.complete) {
+ return this
+ }
+
+ if (other === lunr.Set.empty) {
+ return other
+ }
+
+ if (this.length < other.length) {
+ a = this
+ b = other
+ } else {
+ a = other
+ b = this
+ }
+
+ elements = Object.keys(a.elements)
+
+ for (var i = 0; i < elements.length; i++) {
+ var element = elements[i]
+ if (element in b.elements) {
+ intersection.push(element)
+ }
+ }
+
+ return new lunr.Set (intersection)
+}
+
+/**
+ * Returns a new set combining the elements of this and the specified set.
+ *
+ * @param {lunr.Set} other - set to union with this set.
+ * @return {lunr.Set} a new set that is the union of this and the specified set.
+ */
+
+lunr.Set.prototype.union = function (other) {
+ if (other === lunr.Set.complete) {
+ return lunr.Set.complete
+ }
+
+ if (other === lunr.Set.empty) {
+ return this
+ }
+
+ return new lunr.Set(Object.keys(this.elements).concat(Object.keys(other.elements)))
+}
+/**
+ * A function to calculate the inverse document frequency for
+ * a posting. This is shared between the builder and the index
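+ *
+ * The value computed below is log(1 + |(documentCount - documentsWithTerm + 0.5) /
+ * (documentsWithTerm + 0.5)|), i.e. a smoothed, BM25-style inverse document frequency.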
+ *
+ * @private
+ * @param {object} posting - The posting for a given term
+ * @param {number} documentCount - The total number of documents.
+ */
+lunr.idf = function (posting, documentCount) {
+ var documentsWithTerm = 0
+
+ for (var fieldName in posting) {
+ if (fieldName == '_index') continue // Ignore the term index, it's not a field
+ documentsWithTerm += Object.keys(posting[fieldName]).length
+ }
+
+ var x = (documentCount - documentsWithTerm + 0.5) / (documentsWithTerm + 0.5)
+
+ return Math.log(1 + Math.abs(x))
+}
+
+/**
+ * A token wraps a string representation of a token
+ * as it is passed through the text processing pipeline.
+ *
+ * @constructor
+ * @param {string} [str=''] - The string token being wrapped.
+ * @param {object} [metadata={}] - Metadata associated with this token.
+ */
+lunr.Token = function (str, metadata) {
+ this.str = str || ""
+ this.metadata = metadata || {}
+}
+
+/**
+ * Returns the token string that is being wrapped by this object.
+ *
+ * @returns {string}
+ */
+lunr.Token.prototype.toString = function () {
+ return this.str
+}
+
+/**
+ * A token update function is used when updating or optionally
+ * when cloning a token.
+ *
+ * @callback lunr.Token~updateFunction
+ * @param {string} str - The string representation of the token.
+ * @param {Object} metadata - All metadata associated with this token.
+ */
+
+/**
+ * Applies the given function to the wrapped string token.
+ *
+ * @example
+ * token.update(function (str, metadata) {
+ * return str.toUpperCase()
+ * })
+ *
+ * @param {lunr.Token~updateFunction} fn - A function to apply to the token string.
+ * @returns {lunr.Token}
+ */
+lunr.Token.prototype.update = function (fn) {
+ this.str = fn(this.str, this.metadata)
+ return this
+}
+
+/**
+ * Creates a clone of this token. Optionally a function can be
+ * applied to the cloned token.
+ *
+ * @param {lunr.Token~updateFunction} [fn] - An optional function to apply to the cloned token.
+ * @returns {lunr.Token}
+ */
+lunr.Token.prototype.clone = function (fn) {
+ fn = fn || function (s) { return s }
+ return new lunr.Token (fn(this.str, this.metadata), this.metadata)
+}
+/*!
+ * lunr.tokenizer
+ * Copyright (C) 2020 Oliver Nightingale
+ */
+
+/**
+ * A function for splitting a string into tokens ready to be inserted into
+ * the search index. Uses `lunr.tokenizer.separator` to split strings, change
+ * the value of this property to change how strings are split into tokens.
+ *
+ * This tokenizer will convert its parameter to a string by calling `toString` and
+ * then will split this string on the character in `lunr.tokenizer.separator`.
+ * Arrays will have their elements converted to strings and wrapped in a lunr.Token.
+ *
+ * Optional metadata can be passed to the tokenizer, this metadata will be cloned and
+ * added as metadata to every token that is created from the object to be tokenized.
+ *
+ * @static
+ * @param {?(string|object|object[])} obj - The object to convert into tokens
+ * @param {?object} metadata - Optional metadata to associate with every token
+ * @returns {lunr.Token[]}
+ * @see {@link lunr.Pipeline}
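+ *
+ * @example <caption>Illustrative sketch (not part of the upstream docs)</caption>
+ * // splits on lunr.tokenizer.separator (whitespace and hyphens) and lowercases
+ * lunr.tokenizer('Hello foo-bar').map(String) // ['hello', 'foo', 'bar']
+ * // arrays are converted element-wise rather than split
+ * lunr.tokenizer(['Hello', 'World']).map(String) // ['hello', 'world']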
+ */
+lunr.tokenizer = function (obj, metadata) {
+ if (obj == null || obj == undefined) {
+ return []
+ }
+
+ if (Array.isArray(obj)) {
+ return obj.map(function (t) {
+ return new lunr.Token(
+ lunr.utils.asString(t).toLowerCase(),
+ lunr.utils.clone(metadata)
+ )
+ })
+ }
+
+ var str = obj.toString().toLowerCase(),
+ len = str.length,
+ tokens = []
+
+ for (var sliceEnd = 0, sliceStart = 0; sliceEnd <= len; sliceEnd++) {
+ var char = str.charAt(sliceEnd),
+ sliceLength = sliceEnd - sliceStart
+
+ if ((char.match(lunr.tokenizer.separator) || sliceEnd == len)) {
+
+ if (sliceLength > 0) {
+ var tokenMetadata = lunr.utils.clone(metadata) || {}
+ tokenMetadata["position"] = [sliceStart, sliceLength]
+ tokenMetadata["index"] = tokens.length
+
+ tokens.push(
+ new lunr.Token (
+ str.slice(sliceStart, sliceEnd),
+ tokenMetadata
+ )
+ )
+ }
+
+ sliceStart = sliceEnd + 1
+ }
+
+ }
+
+ return tokens
+}
+
+/**
+ * The separator used to split a string into tokens. Override this property to change how
+ * `lunr.tokenizer` behaves when tokenizing strings. By default this splits on whitespace and hyphens.
+ *
+ * @static
+ * @see lunr.tokenizer
+ */
+lunr.tokenizer.separator = /[\s\-]+/
+/*!
+ * lunr.Pipeline
+ * Copyright (C) 2020 Oliver Nightingale
+ */
+
+/**
+ * lunr.Pipelines maintain an ordered list of functions to be applied to all
+ * tokens in documents entering the search index and queries being run against
+ * the index.
+ *
+ * An instance of lunr.Index created with the lunr shortcut will contain a
+ * pipeline with a stop word filter and an English language stemmer. Extra
+ * functions can be added before or after either of these functions or these
+ * default functions can be removed.
+ *
+ * When run the pipeline will call each function in turn, passing a token, the
+ * index of that token in the original list of all tokens and finally a list of
+ * all the original tokens.
+ *
+ * The output of functions in the pipeline will be passed to the next function
+ * in the pipeline. To exclude a token from entering the index the function
+ * should return undefined, the rest of the pipeline will not be called with
+ * this token.
+ *
+ * For serialisation of pipelines to work, all functions used in an instance of
+ * a pipeline should be registered with lunr.Pipeline. Registered functions can
+ * then be loaded. If trying to load a serialised pipeline that uses functions
+ * that are not registered an error will be thrown.
+ *
+ * If not planning on serialising the pipeline then registering pipeline functions
+ * is not necessary.
+ *
+ * @constructor
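+ *
+ * @example <caption>Illustrative sketch (not part of the upstream docs)</caption>
+ * var pipeline = new lunr.Pipeline
+ * pipeline.add(lunr.trimmer, lunr.stopWordFilter, lunr.stemmer)
+ * // runString wraps the string in a token, runs the stack and maps back to strings
+ * pipeline.runString('fishing') // ['fish']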
+ */
+lunr.Pipeline = function () {
+ this._stack = []
+}
+
+lunr.Pipeline.registeredFunctions = Object.create(null)
+
+/**
+ * A pipeline function maps lunr.Token to lunr.Token. A lunr.Token contains the token
+ * string as well as all known metadata. A pipeline function can mutate the token string
+ * or mutate (or add) metadata for a given token.
+ *
+ * A pipeline function can indicate that the passed token should be discarded by returning
+ * null, undefined or an empty string. This token will not be passed to any downstream pipeline
+ * functions and will not be added to the index.
+ *
+ * Multiple tokens can be returned by returning an array of tokens. Each token will be passed
+ * to any downstream pipeline functions and all returned tokens will be added to the index.
+ *
+ * Any number of pipeline functions may be chained together using a lunr.Pipeline.
+ *
+ * @interface lunr.PipelineFunction
+ * @param {lunr.Token} token - A token from the document being processed.
+ * @param {number} i - The index of this token in the complete list of tokens for this document/field.
+ * @param {lunr.Token[]} tokens - All tokens for this document/field.
+ * @returns {(?lunr.Token|lunr.Token[])}
+ */
+
+/**
+ * Register a function with the pipeline.
+ *
+ * Functions that are used in the pipeline should be registered if the pipeline
+ * needs to be serialised, or a serialised pipeline needs to be loaded.
+ *
+ * Registering a function does not add it to a pipeline, functions must still be
+ * added to instances of the pipeline for them to be used when running a pipeline.
+ *
+ * @param {lunr.PipelineFunction} fn - The function to register.
+ * @param {String} label - The label to register this function with
+ */
+lunr.Pipeline.registerFunction = function (fn, label) {
+ if (label in this.registeredFunctions) {
+ lunr.utils.warn('Overwriting existing registered function: ' + label)
+ }
+
+ fn.label = label
+ lunr.Pipeline.registeredFunctions[fn.label] = fn
+}
+
+/**
+ * Warns if the function is not registered as a Pipeline function.
+ *
+ * @param {lunr.PipelineFunction} fn - The function to check for.
+ * @private
+ */
+lunr.Pipeline.warnIfFunctionNotRegistered = function (fn) {
+ var isRegistered = fn.label && (fn.label in this.registeredFunctions)
+
+ if (!isRegistered) {
+ lunr.utils.warn('Function is not registered with pipeline. This may cause problems when serialising the index.\n', fn)
+ }
+}
+
+/**
+ * Loads a previously serialised pipeline.
+ *
+ * All functions to be loaded must already be registered with lunr.Pipeline.
+ * If any function from the serialised data has not been registered then an
+ * error will be thrown.
+ *
+ * @param {Object} serialised - The serialised pipeline to load.
+ * @returns {lunr.Pipeline}
+ */
+lunr.Pipeline.load = function (serialised) {
+ var pipeline = new lunr.Pipeline
+
+ serialised.forEach(function (fnName) {
+ var fn = lunr.Pipeline.registeredFunctions[fnName]
+
+ if (fn) {
+ pipeline.add(fn)
+ } else {
+ throw new Error('Cannot load unregistered function: ' + fnName)
+ }
+ })
+
+ return pipeline
+}
+
+/**
+ * Adds new functions to the end of the pipeline.
+ *
+ * Logs a warning if the function has not been registered.
+ *
+ * @param {lunr.PipelineFunction[]} functions - Any number of functions to add to the pipeline.
+ */
+lunr.Pipeline.prototype.add = function () {
+ var fns = Array.prototype.slice.call(arguments)
+
+ fns.forEach(function (fn) {
+ lunr.Pipeline.warnIfFunctionNotRegistered(fn)
+ this._stack.push(fn)
+ }, this)
+}
+
+/**
+ * Adds a single function after a function that already exists in the
+ * pipeline.
+ *
+ * Logs a warning if the function has not been registered.
+ *
+ * @param {lunr.PipelineFunction} existingFn - A function that already exists in the pipeline.
+ * @param {lunr.PipelineFunction} newFn - The new function to add to the pipeline.
+ */
+lunr.Pipeline.prototype.after = function (existingFn, newFn) {
+ lunr.Pipeline.warnIfFunctionNotRegistered(newFn)
+
+ var pos = this._stack.indexOf(existingFn)
+ if (pos == -1) {
+ throw new Error('Cannot find existingFn')
+ }
+
+ pos = pos + 1
+ this._stack.splice(pos, 0, newFn)
+}
+
+/**
+ * Adds a single function before a function that already exists in the
+ * pipeline.
+ *
+ * Logs a warning if the function has not been registered.
+ *
+ * @param {lunr.PipelineFunction} existingFn - A function that already exists in the pipeline.
+ * @param {lunr.PipelineFunction} newFn - The new function to add to the pipeline.
+ */
+lunr.Pipeline.prototype.before = function (existingFn, newFn) {
+ lunr.Pipeline.warnIfFunctionNotRegistered(newFn)
+
+ var pos = this._stack.indexOf(existingFn)
+ if (pos == -1) {
+ throw new Error('Cannot find existingFn')
+ }
+
+ this._stack.splice(pos, 0, newFn)
+}
+
+/**
+ * Removes a function from the pipeline.
+ *
+ * @param {lunr.PipelineFunction} fn The function to remove from the pipeline.
+ */
+lunr.Pipeline.prototype.remove = function (fn) {
+ var pos = this._stack.indexOf(fn)
+ if (pos == -1) {
+ return
+ }
+
+ this._stack.splice(pos, 1)
+}
+
+/**
+ * Runs the current list of functions that make up the pipeline against the
+ * passed tokens.
+ *
+ * @param {Array} tokens The tokens to run through the pipeline.
+ * @returns {Array}
+ */
+lunr.Pipeline.prototype.run = function (tokens) {
+ var stackLength = this._stack.length
+
+ for (var i = 0; i < stackLength; i++) {
+ var fn = this._stack[i]
+ var memo = []
+
+ for (var j = 0; j < tokens.length; j++) {
+ var result = fn(tokens[j], j, tokens)
+
+ if (result === null || result === void 0 || result === '') continue
+
+ if (Array.isArray(result)) {
+ for (var k = 0; k < result.length; k++) {
+ memo.push(result[k])
+ }
+ } else {
+ memo.push(result)
+ }
+ }
+
+ tokens = memo
+ }
+
+ return tokens
+}
+
+/**
+ * Convenience method for passing a string through a pipeline and getting
+ * strings out. This method takes care of wrapping the passed string in a
+ * token and mapping the resulting tokens back to strings.
+ *
+ * @param {string} str - The string to pass through the pipeline.
+ * @param {?object} metadata - Optional metadata to associate with the token
+ * passed to the pipeline.
+ * @returns {string[]}
+ */
+lunr.Pipeline.prototype.runString = function (str, metadata) {
+ var token = new lunr.Token (str, metadata)
+
+ return this.run([token]).map(function (t) {
+ return t.toString()
+ })
+}
+
+/**
+ * Resets the pipeline by removing any existing processors.
+ *
+ */
+lunr.Pipeline.prototype.reset = function () {
+ this._stack = []
+}
+
+/**
+ * Returns a representation of the pipeline ready for serialisation.
+ *
+ * Logs a warning if the function has not been registered.
+ *
+ * @returns {Array}
+ */
+lunr.Pipeline.prototype.toJSON = function () {
+ return this._stack.map(function (fn) {
+ lunr.Pipeline.warnIfFunctionNotRegistered(fn)
+
+ return fn.label
+ })
+}
+/*!
+ * lunr.Vector
+ * Copyright (C) 2020 Oliver Nightingale
+ */
+
+/**
+ * A vector is used to construct the vector space of documents and queries. These
+ * vectors support operations to determine the similarity between two documents or
+ * a document and a query.
+ *
+ * Normally no parameters are required for initializing a vector, but in the case of
+ * loading a previously dumped vector the raw elements can be provided to the constructor.
+ *
+ * For performance reasons vectors are implemented with a flat array, where an element's
+ * index is immediately followed by its value. E.g. [index, value, index, value]. This
+ * allows the underlying array to be as sparse as possible and still offer decent
+ * performance when being used for vector calculations.
+ *
+ * @constructor
+ * @param {Number[]} [elements] - The flat list of element index and element value pairs.
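+ *
+ * @example <caption>Illustrative sketch (not part of the upstream docs) of the flat index/value layout</caption>
+ * var v = new lunr.Vector
+ * v.insert(3, 5)
+ * v.insert(7, 2)
+ * v.toJSON() // [3, 5, 7, 2] - each index is immediately followed by its value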
+ */
+lunr.Vector = function (elements) {
+ this._magnitude = 0
+ this.elements = elements || []
+}
+
+
+/**
+ * Calculates the position within the vector to insert a given index.
+ *
+ * This is used internally by insert and upsert. If there are duplicate indexes then
+ * the position is returned as if the value for that index were to be updated, but it
+ * is the caller's responsibility to check whether there is a duplicate at that index.
+ *
+ * @param {Number} index - The index whose insertion position should be located.
+ * @returns {Number}
+ */
+lunr.Vector.prototype.positionForIndex = function (index) {
+ // For an empty vector the tuple can be inserted at the beginning
+ if (this.elements.length == 0) {
+ return 0
+ }
+
+ var start = 0,
+ end = this.elements.length / 2,
+ sliceLength = end - start,
+ pivotPoint = Math.floor(sliceLength / 2),
+ pivotIndex = this.elements[pivotPoint * 2]
+
+ while (sliceLength > 1) {
+ if (pivotIndex < index) {
+ start = pivotPoint
+ }
+
+ if (pivotIndex > index) {
+ end = pivotPoint
+ }
+
+ if (pivotIndex == index) {
+ break
+ }
+
+ sliceLength = end - start
+ pivotPoint = start + Math.floor(sliceLength / 2)
+ pivotIndex = this.elements[pivotPoint * 2]
+ }
+
+ if (pivotIndex == index) {
+ return pivotPoint * 2
+ }
+
+ if (pivotIndex > index) {
+ return pivotPoint * 2
+ }
+
+ if (pivotIndex < index) {
+ return (pivotPoint + 1) * 2
+ }
+}
+
+/**
+ * Inserts an element at an index within the vector.
+ *
+ * Does not allow duplicates, will throw an error if there is already an entry
+ * for this index.
+ *
+ * @param {Number} insertIdx - The index at which the element should be inserted.
+ * @param {Number} val - The value to be inserted into the vector.
+ */
+lunr.Vector.prototype.insert = function (insertIdx, val) {
+ this.upsert(insertIdx, val, function () {
+ throw "duplicate index"
+ })
+}
+
+/**
+ * Inserts or updates an existing index within the vector.
+ *
+ * @param {Number} insertIdx - The index at which the element should be inserted.
+ * @param {Number} val - The value to be inserted into the vector.
+ * @param {function} fn - A function that is called for updates, the existing value and the
+ * requested value are passed as arguments
+ */
+lunr.Vector.prototype.upsert = function (insertIdx, val, fn) {
+ this._magnitude = 0
+ var position = this.positionForIndex(insertIdx)
+
+ if (this.elements[position] == insertIdx) {
+ this.elements[position + 1] = fn(this.elements[position + 1], val)
+ } else {
+ this.elements.splice(position, 0, insertIdx, val)
+ }
+}
+
+/**
+ * Calculates the magnitude of this vector.
+ *
+ * @returns {Number}
+ */
+lunr.Vector.prototype.magnitude = function () {
+ if (this._magnitude) return this._magnitude
+
+ var sumOfSquares = 0,
+ elementsLength = this.elements.length
+
+ for (var i = 1; i < elementsLength; i += 2) {
+ var val = this.elements[i]
+ sumOfSquares += val * val
+ }
+
+ return this._magnitude = Math.sqrt(sumOfSquares)
+}
+
+/**
+ * Calculates the dot product of this vector and another vector.
+ *
+ * @param {lunr.Vector} otherVector - The vector to compute the dot product with.
+ * @returns {Number}
+ */
+lunr.Vector.prototype.dot = function (otherVector) {
+ var dotProduct = 0,
+ a = this.elements, b = otherVector.elements,
+ aLen = a.length, bLen = b.length,
+ aVal = 0, bVal = 0,
+ i = 0, j = 0
+
+ while (i < aLen && j < bLen) {
+ aVal = a[i], bVal = b[j]
+ if (aVal < bVal) {
+ i += 2
+ } else if (aVal > bVal) {
+ j += 2
+ } else if (aVal == bVal) {
+ dotProduct += a[i + 1] * b[j + 1]
+ i += 2
+ j += 2
+ }
+ }
+
+ return dotProduct
+}
+
+/**
+ * Calculates the similarity between this vector and another vector.
+ *
+ * @param {lunr.Vector} otherVector - The other vector to calculate the
+ * similarity with.
+ * @returns {Number}
+ */
+lunr.Vector.prototype.similarity = function (otherVector) {
+ return this.dot(otherVector) / this.magnitude() || 0
+}
+
+/**
+ * Converts the vector to an array of the elements within the vector.
+ *
+ * @returns {Number[]}
+ */
+lunr.Vector.prototype.toArray = function () {
+ var output = new Array (this.elements.length / 2)
+
+ for (var i = 1, j = 0; i < this.elements.length; i += 2, j++) {
+ output[j] = this.elements[i]
+ }
+
+ return output
+}
+
+/**
+ * A JSON serializable representation of the vector.
+ *
+ * @returns {Number[]}
+ */
+lunr.Vector.prototype.toJSON = function () {
+ return this.elements
+}
+/* eslint-disable */
+/*!
+ * lunr.stemmer
+ * Copyright (C) 2020 Oliver Nightingale
+ * Includes code from - http://tartarus.org/~martin/PorterStemmer/js.txt
+ */
+
+/**
+ * lunr.stemmer is an english language stemmer, this is a JavaScript
+ * implementation of the PorterStemmer taken from http://tartarus.org/~martin
+ *
+ * @static
+ * @implements {lunr.PipelineFunction}
+ * @param {lunr.Token} token - The string to stem
+ * @returns {lunr.Token}
+ * @see {@link lunr.Pipeline}
+ * @function
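+ *
+ * @example <caption>Illustrative sketch (not part of the upstream docs)</caption>
+ * lunr.stemmer(new lunr.Token('fishing')).toString() // 'fish'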
+ */
+lunr.stemmer = (function(){
+ var step2list = {
+ "ational" : "ate",
+ "tional" : "tion",
+ "enci" : "ence",
+ "anci" : "ance",
+ "izer" : "ize",
+ "bli" : "ble",
+ "alli" : "al",
+ "entli" : "ent",
+ "eli" : "e",
+ "ousli" : "ous",
+ "ization" : "ize",
+ "ation" : "ate",
+ "ator" : "ate",
+ "alism" : "al",
+ "iveness" : "ive",
+ "fulness" : "ful",
+ "ousness" : "ous",
+ "aliti" : "al",
+ "iviti" : "ive",
+ "biliti" : "ble",
+ "logi" : "log"
+ },
+
+ step3list = {
+ "icate" : "ic",
+ "ative" : "",
+ "alize" : "al",
+ "iciti" : "ic",
+ "ical" : "ic",
+ "ful" : "",
+ "ness" : ""
+ },
+
+ c = "[^aeiou]", // consonant
+ v = "[aeiouy]", // vowel
+ C = c + "[^aeiouy]*", // consonant sequence
+ V = v + "[aeiou]*", // vowel sequence
+
+ mgr0 = "^(" + C + ")?" + V + C, // [C]VC... is m>0
+ meq1 = "^(" + C + ")?" + V + C + "(" + V + ")?$", // [C]VC[V] is m=1
+ mgr1 = "^(" + C + ")?" + V + C + V + C, // [C]VCVC... is m>1
+ s_v = "^(" + C + ")?" + v; // vowel in stem
+
+ var re_mgr0 = new RegExp(mgr0);
+ var re_mgr1 = new RegExp(mgr1);
+ var re_meq1 = new RegExp(meq1);
+ var re_s_v = new RegExp(s_v);
+
+ var re_1a = /^(.+?)(ss|i)es$/;
+ var re2_1a = /^(.+?)([^s])s$/;
+ var re_1b = /^(.+?)eed$/;
+ var re2_1b = /^(.+?)(ed|ing)$/;
+ var re_1b_2 = /.$/;
+ var re2_1b_2 = /(at|bl|iz)$/;
+ var re3_1b_2 = new RegExp("([^aeiouylsz])\\1$");
+ var re4_1b_2 = new RegExp("^" + C + v + "[^aeiouwxy]$");
+
+ var re_1c = /^(.+?[^aeiou])y$/;
+ var re_2 = /^(.+?)(ational|tional|enci|anci|izer|bli|alli|entli|eli|ousli|ization|ation|ator|alism|iveness|fulness|ousness|aliti|iviti|biliti|logi)$/;
+
+ var re_3 = /^(.+?)(icate|ative|alize|iciti|ical|ful|ness)$/;
+
+ var re_4 = /^(.+?)(al|ance|ence|er|ic|able|ible|ant|ement|ment|ent|ou|ism|ate|iti|ous|ive|ize)$/;
+ var re2_4 = /^(.+?)(s|t)(ion)$/;
+
+ var re_5 = /^(.+?)e$/;
+ var re_5_1 = /ll$/;
+ var re3_5 = new RegExp("^" + C + v + "[^aeiouwxy]$");
+
+ var porterStemmer = function porterStemmer(w) {
+ var stem,
+ suffix,
+ firstch,
+ re,
+ re2,
+ re3,
+ re4;
+
+ if (w.length < 3) { return w; }
+
+ firstch = w.substr(0,1);
+ if (firstch == "y") {
+ w = firstch.toUpperCase() + w.substr(1);
+ }
+
+ // Step 1a
+ re = re_1a
+ re2 = re2_1a;
+
+ if (re.test(w)) { w = w.replace(re,"$1$2"); }
+ else if (re2.test(w)) { w = w.replace(re2,"$1$2"); }
+
+ // Step 1b
+ re = re_1b;
+ re2 = re2_1b;
+ if (re.test(w)) {
+ var fp = re.exec(w);
+ re = re_mgr0;
+ if (re.test(fp[1])) {
+ re = re_1b_2;
+ w = w.replace(re,"");
+ }
+ } else if (re2.test(w)) {
+ var fp = re2.exec(w);
+ stem = fp[1];
+ re2 = re_s_v;
+ if (re2.test(stem)) {
+ w = stem;
+ re2 = re2_1b_2;
+ re3 = re3_1b_2;
+ re4 = re4_1b_2;
+ if (re2.test(w)) { w = w + "e"; }
+ else if (re3.test(w)) { re = re_1b_2; w = w.replace(re,""); }
+ else if (re4.test(w)) { w = w + "e"; }
+ }
+ }
+
+ // Step 1c - replace suffix y or Y by i if preceded by a non-vowel which is not the first letter of the word (so cry -> cri, by -> by, say -> say)
+ re = re_1c;
+ if (re.test(w)) {
+ var fp = re.exec(w);
+ stem = fp[1];
+ w = stem + "i";
+ }
+
+ // Step 2
+ re = re_2;
+ if (re.test(w)) {
+ var fp = re.exec(w);
+ stem = fp[1];
+ suffix = fp[2];
+ re = re_mgr0;
+ if (re.test(stem)) {
+ w = stem + step2list[suffix];
+ }
+ }
+
+ // Step 3
+ re = re_3;
+ if (re.test(w)) {
+ var fp = re.exec(w);
+ stem = fp[1];
+ suffix = fp[2];
+ re = re_mgr0;
+ if (re.test(stem)) {
+ w = stem + step3list[suffix];
+ }
+ }
+
+ // Step 4
+ re = re_4;
+ re2 = re2_4;
+ if (re.test(w)) {
+ var fp = re.exec(w);
+ stem = fp[1];
+ re = re_mgr1;
+ if (re.test(stem)) {
+ w = stem;
+ }
+ } else if (re2.test(w)) {
+ var fp = re2.exec(w);
+ stem = fp[1] + fp[2];
+ re2 = re_mgr1;
+ if (re2.test(stem)) {
+ w = stem;
+ }
+ }
+
+ // Step 5
+ re = re_5;
+ if (re.test(w)) {
+ var fp = re.exec(w);
+ stem = fp[1];
+ re = re_mgr1;
+ re2 = re_meq1;
+ re3 = re3_5;
+ if (re.test(stem) || (re2.test(stem) && !(re3.test(stem)))) {
+ w = stem;
+ }
+ }
+
+ re = re_5_1;
+ re2 = re_mgr1;
+ if (re.test(w) && re2.test(w)) {
+ re = re_1b_2;
+ w = w.replace(re,"");
+ }
+
+ // and turn initial Y back to y
+
+ if (firstch == "y") {
+ w = firstch.toLowerCase() + w.substr(1);
+ }
+
+ return w;
+ };
+
+ return function (token) {
+ return token.update(porterStemmer);
+ }
+})();
+
+lunr.Pipeline.registerFunction(lunr.stemmer, 'stemmer')
+/*!
+ * lunr.stopWordFilter
+ * Copyright (C) 2020 Oliver Nightingale
+ */
+
+/**
+ * lunr.generateStopWordFilter builds a stopWordFilter function from the provided
+ * list of stop words.
+ *
+ * The built in lunr.stopWordFilter is built using this generator and can be used
+ * to generate custom stopWordFilters for applications or non English languages.
+ *
+ * @function
+ * @param {Array} stopWords The list of stop words to filter out
+ * @returns {lunr.PipelineFunction}
+ * @see lunr.Pipeline
+ * @see lunr.stopWordFilter
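+ *
+ * @example <caption>Sketch, not from the upstream docs: building a custom stop word filter</caption>
+ * var customStopWordFilter = lunr.generateStopWordFilter(['foo', 'bar'])
+ * // register it so that indexes using it can be serialised and reloaded
+ * lunr.Pipeline.registerFunction(customStopWordFilter, 'customStopWordFilter')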
+ */
+lunr.generateStopWordFilter = function (stopWords) {
+ var words = stopWords.reduce(function (memo, stopWord) {
+ memo[stopWord] = stopWord
+ return memo
+ }, {})
+
+ return function (token) {
+ if (token && words[token.toString()] !== token.toString()) return token
+ }
+}
+
+/**
+ * lunr.stopWordFilter is an English language stop word list filter, any words
+ * contained in the list will not be passed through the filter.
+ *
+ * This is intended to be used in the Pipeline. If the token does not pass the
+ * filter then undefined will be returned.
+ *
+ * @function
+ * @implements {lunr.PipelineFunction}
+ * @param {lunr.Token} token - A token to check for being a stop word.
+ * @returns {lunr.Token}
+ * @see {@link lunr.Pipeline}
+ */
+lunr.stopWordFilter = lunr.generateStopWordFilter([
+ 'a',
+ 'able',
+ 'about',
+ 'across',
+ 'after',
+ 'all',
+ 'almost',
+ 'also',
+ 'am',
+ 'among',
+ 'an',
+ 'and',
+ 'any',
+ 'are',
+ 'as',
+ 'at',
+ 'be',
+ 'because',
+ 'been',
+ 'but',
+ 'by',
+ 'can',
+ 'cannot',
+ 'could',
+ 'dear',
+ 'did',
+ 'do',
+ 'does',
+ 'either',
+ 'else',
+ 'ever',
+ 'every',
+ 'for',
+ 'from',
+ 'get',
+ 'got',
+ 'had',
+ 'has',
+ 'have',
+ 'he',
+ 'her',
+ 'hers',
+ 'him',
+ 'his',
+ 'how',
+ 'however',
+ 'i',
+ 'if',
+ 'in',
+ 'into',
+ 'is',
+ 'it',
+ 'its',
+ 'just',
+ 'least',
+ 'let',
+ 'like',
+ 'likely',
+ 'may',
+ 'me',
+ 'might',
+ 'most',
+ 'must',
+ 'my',
+ 'neither',
+ 'no',
+ 'nor',
+ 'not',
+ 'of',
+ 'off',
+ 'often',
+ 'on',
+ 'only',
+ 'or',
+ 'other',
+ 'our',
+ 'own',
+ 'rather',
+ 'said',
+ 'say',
+ 'says',
+ 'she',
+ 'should',
+ 'since',
+ 'so',
+ 'some',
+ 'than',
+ 'that',
+ 'the',
+ 'their',
+ 'them',
+ 'then',
+ 'there',
+ 'these',
+ 'they',
+ 'this',
+ 'tis',
+ 'to',
+ 'too',
+ 'twas',
+ 'us',
+ 'wants',
+ 'was',
+ 'we',
+ 'were',
+ 'what',
+ 'when',
+ 'where',
+ 'which',
+ 'while',
+ 'who',
+ 'whom',
+ 'why',
+ 'will',
+ 'with',
+ 'would',
+ 'yet',
+ 'you',
+ 'your'
+])
+
+lunr.Pipeline.registerFunction(lunr.stopWordFilter, 'stopWordFilter')
+/*!
+ * lunr.trimmer
+ * Copyright (C) 2020 Oliver Nightingale
+ */
+
+/**
+ * lunr.trimmer is a pipeline function for trimming non word
+ * characters from the beginning and end of tokens before they
+ * enter the index.
+ *
+ * This implementation may not work correctly for non-latin
+ * characters and should either be removed or adapted for use
+ * with languages with non-latin characters.
+ *
+ * @static
+ * @implements {lunr.PipelineFunction}
+ * @param {lunr.Token} token The token to pass through the filter
+ * @returns {lunr.Token}
+ * @see lunr.Pipeline
+ */
+lunr.trimmer = function (token) {
+ return token.update(function (s) {
+ return s.replace(/^\W+/, '').replace(/\W+$/, '')
+ })
+}
+
+lunr.Pipeline.registerFunction(lunr.trimmer, 'trimmer')
+/*!
+ * lunr.TokenSet
+ * Copyright (C) 2020 Oliver Nightingale
+ */
+
+/**
+ * A token set is used to store the unique list of all tokens
+ * within an index. Token sets are also used to represent an
+ * incoming query to the index, this query token set and index
+ * token set are then intersected to find which tokens to look
+ * up in the inverted index.
+ *
+ * A token set can hold multiple tokens, as in the case of the
+ * index token set, or it can hold a single token as in the
+ * case of a simple query token set.
+ *
+ * Additionally token sets are used to perform wildcard matching.
+ * Leading, contained and trailing wildcards are supported, and
+ * from this edit distance matching can also be provided.
+ *
+ * Token sets are implemented as a minimal finite state automata,
+ * where both common prefixes and suffixes are shared between tokens.
+ * This helps to reduce the space used for storing the token set.
+ *
+ * @constructor
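+ *
+ * @example <caption>Illustrative sketch (not part of the upstream docs): wildcard intersection</caption>
+ * var corpus = lunr.TokenSet.fromArray(['cart', 'cat', 'dog']) // input must be sorted
+ * // 'ca*' matches any token starting with 'ca'
+ * corpus.intersect(lunr.TokenSet.fromString('ca*')).toArray() // contains 'cart' and 'cat'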
+ */
+lunr.TokenSet = function () {
+ this.final = false
+ this.edges = {}
+ this.id = lunr.TokenSet._nextId
+ lunr.TokenSet._nextId += 1
+}
+
+/**
+ * Keeps track of the next, auto increment, identifier to assign
+ * to a new tokenSet.
+ *
+ * TokenSets require a unique identifier to be correctly minimised.
+ *
+ * @private
+ */
+lunr.TokenSet._nextId = 1
+
+/**
+ * Creates a TokenSet instance from the given sorted array of words.
+ *
+ * @param {String[]} arr - A sorted array of strings to create the set from.
+ * @returns {lunr.TokenSet}
+ * @throws Will throw an error if the input array is not sorted.
+ */
+lunr.TokenSet.fromArray = function (arr) {
+ var builder = new lunr.TokenSet.Builder
+
+ for (var i = 0, len = arr.length; i < len; i++) {
+ builder.insert(arr[i])
+ }
+
+ builder.finish()
+ return builder.root
+}
+
+/**
+ * Creates a token set from a query clause.
+ *
+ * @private
+ * @param {Object} clause - A single clause from lunr.Query.
+ * @param {string} clause.term - The query clause term.
+ * @param {number} [clause.editDistance] - The optional edit distance for the term.
+ * @returns {lunr.TokenSet}
+ */
+lunr.TokenSet.fromClause = function (clause) {
+ if ('editDistance' in clause) {
+ return lunr.TokenSet.fromFuzzyString(clause.term, clause.editDistance)
+ } else {
+ return lunr.TokenSet.fromString(clause.term)
+ }
+}
+
+/**
+ * Creates a token set representing a single string with a specified
+ * edit distance.
+ *
+ * Insertions, deletions, substitutions and transpositions are each
+ * treated as an edit distance of 1.
+ *
+ * Increasing the allowed edit distance will have a dramatic impact
+ * on the performance of both creating and intersecting these TokenSets.
+ * It is advised to keep the edit distance less than 3.
+ *
+ * @param {string} str - The string to create the token set from.
+ * @param {number} editDistance - The allowed edit distance to match.
+ * @returns {lunr.TokenSet}
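+ *
+ * @example <caption>Sketch, not from the upstream docs</caption>
+ * // matches strings within one edit of 'cat', e.g. 'cart' (one insertion)
+ * var fuzzy = lunr.TokenSet.fromFuzzyString('cat', 1)
+ * fuzzy.intersect(lunr.TokenSet.fromString('cart')).toArray() // ['cart']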
+ */
+lunr.TokenSet.fromFuzzyString = function (str, editDistance) {
+ var root = new lunr.TokenSet
+
+ var stack = [{
+ node: root,
+ editsRemaining: editDistance,
+ str: str
+ }]
+
+ while (stack.length) {
+ var frame = stack.pop()
+
+ // no edit
+ if (frame.str.length > 0) {
+ var char = frame.str.charAt(0),
+ noEditNode
+
+ if (char in frame.node.edges) {
+ noEditNode = frame.node.edges[char]
+ } else {
+ noEditNode = new lunr.TokenSet
+ frame.node.edges[char] = noEditNode
+ }
+
+ if (frame.str.length == 1) {
+ noEditNode.final = true
+ }
+
+ stack.push({
+ node: noEditNode,
+ editsRemaining: frame.editsRemaining,
+ str: frame.str.slice(1)
+ })
+ }
+
+ if (frame.editsRemaining == 0) {
+ continue
+ }
+
+ // insertion
+ if ("*" in frame.node.edges) {
+ var insertionNode = frame.node.edges["*"]
+ } else {
+ var insertionNode = new lunr.TokenSet
+ frame.node.edges["*"] = insertionNode
+ }
+
+ if (frame.str.length == 0) {
+ insertionNode.final = true
+ }
+
+ stack.push({
+ node: insertionNode,
+ editsRemaining: frame.editsRemaining - 1,
+ str: frame.str
+ })
+
+ // deletion
+ // can only do a deletion if we have enough edits remaining
+ // and if there are characters left to delete in the string
+ if (frame.str.length > 1) {
+ stack.push({
+ node: frame.node,
+ editsRemaining: frame.editsRemaining - 1,
+ str: frame.str.slice(1)
+ })
+ }
+
+ // deletion
+ // just removing the last character from the str
+ if (frame.str.length == 1) {
+ frame.node.final = true
+ }
+
+ // substitution
+ // can only do a substitution if we have enough edits remaining
+ // and if there are characters left to substitute
+ if (frame.str.length >= 1) {
+ if ("*" in frame.node.edges) {
+ var substitutionNode = frame.node.edges["*"]
+ } else {
+ var substitutionNode = new lunr.TokenSet
+ frame.node.edges["*"] = substitutionNode
+ }
+
+ if (frame.str.length == 1) {
+ substitutionNode.final = true
+ }
+
+ stack.push({
+ node: substitutionNode,
+ editsRemaining: frame.editsRemaining - 1,
+ str: frame.str.slice(1)
+ })
+ }
+
+ // transposition
+ // can only do a transposition if there are edits remaining
+ // and there are enough characters to transpose
+ if (frame.str.length > 1) {
+ var charA = frame.str.charAt(0),
+ charB = frame.str.charAt(1),
+ transposeNode
+
+ if (charB in frame.node.edges) {
+ transposeNode = frame.node.edges[charB]
+ } else {
+ transposeNode = new lunr.TokenSet
+ frame.node.edges[charB] = transposeNode
+ }
+
+ if (frame.str.length == 1) {
+ transposeNode.final = true
+ }
+
+ stack.push({
+ node: transposeNode,
+ editsRemaining: frame.editsRemaining - 1,
+ str: charA + frame.str.slice(2)
+ })
+ }
+ }
+
+ return root
+}
+
+/**
+ * Creates a TokenSet from a string.
+ *
+ * The string may contain one or more wildcard characters (*)
+ * that will allow wildcard matching when intersecting with
+ * another TokenSet.
+ *
+ * @param {string} str - The string to create a TokenSet from.
+ * @returns {lunr.TokenSet}
+ */
+lunr.TokenSet.fromString = function (str) {
+ var node = new lunr.TokenSet,
+ root = node
+
+ /*
+ * Iterates through all characters within the passed string
+ * appending a node for each character.
+ *
+ * When a wildcard character is found then a self
+ * referencing edge is introduced to continually match
+ * any number of any characters.
+ */
+ for (var i = 0, len = str.length; i < len; i++) {
+ var char = str[i],
+ final = (i == len - 1)
+
+ if (char == "*") {
+ node.edges[char] = node
+ node.final = final
+
+ } else {
+ var next = new lunr.TokenSet
+ next.final = final
+
+ node.edges[char] = next
+ node = next
+ }
+ }
+
+ return root
+}
+
+/**
+ * Converts this TokenSet into an array of strings
+ * contained within the TokenSet.
+ *
+ * This is not intended to be used on a TokenSet that
+ * contains wildcards, in these cases the results are
+ * undefined and are likely to cause an infinite loop.
+ *
+ * @returns {string[]}
+ */
+lunr.TokenSet.prototype.toArray = function () {
+ var words = []
+
+ var stack = [{
+ prefix: "",
+ node: this
+ }]
+
+ while (stack.length) {
+ var frame = stack.pop(),
+ edges = Object.keys(frame.node.edges),
+ len = edges.length
+
+ if (frame.node.final) {
+ /* In Safari, at this point the prefix is sometimes corrupted, see:
+ * https://github.com/olivernn/lunr.js/issues/279 Calling any
+ * String.prototype method forces Safari to "cast" this string to what
+ * it's supposed to be, fixing the bug. */
+ frame.prefix.charAt(0)
+ words.push(frame.prefix)
+ }
+
+ for (var i = 0; i < len; i++) {
+ var edge = edges[i]
+
+ stack.push({
+ prefix: frame.prefix.concat(edge),
+ node: frame.node.edges[edge]
+ })
+ }
+ }
+
+ return words
+}
+
+/**
+ * Generates a string representation of a TokenSet.
+ *
+ * This is intended to allow TokenSets to be used as keys
+ * in objects, largely to aid the construction and minimisation
+ * of a TokenSet. As such it is not designed to be a human
+ * friendly representation of the TokenSet.
+ *
+ * @returns {string}
+ */
+lunr.TokenSet.prototype.toString = function () {
+ // NOTE: Using Object.keys here as this.edges is very likely
+ // to enter 'hash-mode' with many keys being added
+ //
+ // avoiding a for-in loop here as it leads to the function
+ // being de-optimised (at least in V8). From some simple
+ // benchmarks the performance is comparable, but allowing
+ // V8 to optimize may mean easy performance wins in the future.
+
+ if (this._str) {
+ return this._str
+ }
+
+ var str = this.final ? '1' : '0',
+ labels = Object.keys(this.edges).sort(),
+ len = labels.length
+
+ for (var i = 0; i < len; i++) {
+ var label = labels[i],
+ node = this.edges[label]
+
+ str = str + label + node.id
+ }
+
+ return str
+}
+
+/**
+ * Returns a new TokenSet that is the intersection of
+ * this TokenSet and the passed TokenSet.
+ *
+ * This intersection will take into account any wildcards
+ * contained within the TokenSet.
+ *
+ * @param {lunr.TokenSet} b - An other TokenSet to intersect with.
+ * @returns {lunr.TokenSet}
+ */
+lunr.TokenSet.prototype.intersect = function (b) {
+ var output = new lunr.TokenSet,
+ frame = undefined
+
+ var stack = [{
+ qNode: b,
+ output: output,
+ node: this
+ }]
+
+ while (stack.length) {
+ frame = stack.pop()
+
+ // NOTE: As with the #toString method, we are using
+ // Object.keys and a for loop instead of a for-in loop
+ // as both of these objects enter 'hash' mode, causing
+ // the function to be de-optimised in V8
+ var qEdges = Object.keys(frame.qNode.edges),
+ qLen = qEdges.length,
+ nEdges = Object.keys(frame.node.edges),
+ nLen = nEdges.length
+
+ for (var q = 0; q < qLen; q++) {
+ var qEdge = qEdges[q]
+
+ for (var n = 0; n < nLen; n++) {
+ var nEdge = nEdges[n]
+
+ if (nEdge == qEdge || qEdge == '*') {
+ var node = frame.node.edges[nEdge],
+ qNode = frame.qNode.edges[qEdge],
+ final = node.final && qNode.final,
+ next = undefined
+
+ if (nEdge in frame.output.edges) {
+ // an edge already exists for this character
+ // no need to create a new node, just set the finality
+ // bit unless this node is already final
+ next = frame.output.edges[nEdge]
+ next.final = next.final || final
+
+ } else {
+ // no edge exists yet, must create one
+ // set the finality bit and insert it
+ // into the output
+ next = new lunr.TokenSet
+ next.final = final
+ frame.output.edges[nEdge] = next
+ }
+
+ stack.push({
+ qNode: qNode,
+ output: next,
+ node: node
+ })
+ }
+ }
+ }
+ }
+
+ return output
+}
+lunr.TokenSet.Builder = function () {
+ this.previousWord = ""
+ this.root = new lunr.TokenSet
+ this.uncheckedNodes = []
+ this.minimizedNodes = {}
+}
+
+lunr.TokenSet.Builder.prototype.insert = function (word) {
+ var node,
+ commonPrefix = 0
+
+ if (word < this.previousWord) {
+ throw new Error ("Out of order word insertion")
+ }
+
+ for (var i = 0; i < word.length && i < this.previousWord.length; i++) {
+ if (word[i] != this.previousWord[i]) break
+ commonPrefix++
+ }
+
+ this.minimize(commonPrefix)
+
+ if (this.uncheckedNodes.length == 0) {
+ node = this.root
+ } else {
+ node = this.uncheckedNodes[this.uncheckedNodes.length - 1].child
+ }
+
+ for (var i = commonPrefix; i < word.length; i++) {
+ var nextNode = new lunr.TokenSet,
+ char = word[i]
+
+ node.edges[char] = nextNode
+
+ this.uncheckedNodes.push({
+ parent: node,
+ char: char,
+ child: nextNode
+ })
+
+ node = nextNode
+ }
+
+ node.final = true
+ this.previousWord = word
+}
+
+lunr.TokenSet.Builder.prototype.finish = function () {
+ this.minimize(0)
+}
+
+lunr.TokenSet.Builder.prototype.minimize = function (downTo) {
+ for (var i = this.uncheckedNodes.length - 1; i >= downTo; i--) {
+ var node = this.uncheckedNodes[i],
+ childKey = node.child.toString()
+
+ if (childKey in this.minimizedNodes) {
+ node.parent.edges[node.char] = this.minimizedNodes[childKey]
+ } else {
+ // Cache the key for this node since
+ // we know it can't change anymore
+ node.child._str = childKey
+
+ this.minimizedNodes[childKey] = node.child
+ }
+
+ this.uncheckedNodes.pop()
+ }
+}
+/*!
+ * lunr.Index
+ * Copyright (C) 2020 Oliver Nightingale
+ */
+
+/**
+ * An index contains the built index of all documents and provides a query interface
+ * to the index.
+ *
+ * Usually instances of lunr.Index will not be created using this constructor, instead
+ * lunr.Builder should be used to construct new indexes, or lunr.Index.load should be
+ * used to load previously built and serialized indexes.
+ *
+ * @constructor
+ * @param {Object} attrs - The attributes of the built search index.
+ * @param {Object} attrs.invertedIndex - An index of term/field to document reference.
+ * @param {Object} attrs.fieldVectors - Field vectors
+ * @param {lunr.TokenSet} attrs.tokenSet - A set of all corpus tokens.
+ * @param {string[]} attrs.fields - The names of indexed document fields.
+ * @param {lunr.Pipeline} attrs.pipeline - The pipeline to use for search terms.
+ */
+lunr.Index = function (attrs) {
+ this.invertedIndex = attrs.invertedIndex
+ this.fieldVectors = attrs.fieldVectors
+ this.tokenSet = attrs.tokenSet
+ this.fields = attrs.fields
+ this.pipeline = attrs.pipeline
+}
+
+/**
+ * A result contains details of a document matching a search query.
+ * @typedef {Object} lunr.Index~Result
+ * @property {string} ref - The reference of the document this result represents.
+ * @property {number} score - A number between 0 and 1 representing how similar this document is to the query.
+ * @property {lunr.MatchData} matchData - Contains metadata about this match including which term(s) caused the match.
+ */
+
+/**
+ * Although lunr provides the ability to create queries using lunr.Query, it also provides a simple
+ * query language which itself is parsed into an instance of lunr.Query.
+ *
+ * For programmatically building queries it is advised to directly use lunr.Query, the query language
+ * is best used for human entered text rather than program generated text.
+ *
+ * At its simplest queries can just be a single term, e.g. `hello`, multiple terms are also supported
+ * and will be combined with OR, e.g `hello world` will match documents that contain either 'hello'
+ * or 'world', though those that contain both will rank higher in the results.
+ *
+ * Wildcards can be included in terms to match one or more unspecified characters, these wildcards can
+ * be inserted anywhere within the term, and more than one wildcard can exist in a single term. Adding
+ * wildcards will increase the number of documents that will be found but can also have a negative
+ * impact on query performance, especially with wildcards at the beginning of a term.
+ *
+ * Terms can be restricted to specific fields, e.g. `title:hello`, only documents with the term
+ * hello in the title field will match this query. Using a field not present in the index will lead
+ * to an error being thrown.
+ *
+ * Modifiers can also be added to terms, lunr supports edit distance and boost modifiers on terms. A term
+ * boost will make documents matching that term score higher, e.g. `foo^5`. Edit distance is also supported
+ * to provide fuzzy matching, e.g. 'hello~2' will match documents containing 'hello' within an edit distance of 2.
+ * Avoid large values for edit distance to improve query performance.
+ *
+ * Each term also supports a presence modifier. By default a term's presence in a document is optional, however
+ * this can be changed to either required or prohibited. For a term's presence to be required in a document the
+ * term should be prefixed with a '+', e.g. `+foo bar` is a search for documents that must contain 'foo' and
+ * optionally contain 'bar'. Conversely a leading '-' sets the term's presence to prohibited, i.e. it must not
+ * appear in a document, e.g. `-foo bar` is a search for documents that do not contain 'foo' but may contain 'bar'.
+ *
+ * To escape special characters the backslash character '\' can be used, this allows searches to include
+ * characters that would normally be considered modifiers, e.g. `foo\~2` will search for a term "foo~2" instead
+ * of attempting to apply a boost of 2 to the search term "foo".
+ *
+ * @typedef {string} lunr.Index~QueryString
+ * @example <caption>Simple single term query</caption>
+ * hello
+ * @example <caption>Multiple term query</caption>
+ * hello world
+ * @example <caption>term scoped to a field</caption>
+ * title:hello
+ * @example <caption>term with a boost of 10</caption>
+ * hello^10
+ * @example <caption>term with an edit distance of 2</caption>
+ * hello~2
+ * @example <caption>terms with presence modifiers</caption>
+ * -foo +bar baz
+ */
+
+/**
+ * Performs a search against the index using lunr query syntax.
+ *
+ * Results will be returned sorted by their score, the most relevant results
+ * will be returned first. For details on how the score is calculated, please see
+ * the {@link https://lunrjs.com/guides/searching.html#scoring|guide}.
+ *
+ * For more programmatic querying use lunr.Index#query.
+ *
+ * @param {lunr.Index~QueryString} queryString - A string containing a lunr query.
+ * @throws {lunr.QueryParseError} If the passed query string cannot be parsed.
+ * @returns {lunr.Index~Result[]}
+ */
+lunr.Index.prototype.search = function (queryString) {
+ return this.query(function (query) {
+ var parser = new lunr.QueryParser(queryString, query)
+ parser.parse()
+ })
+}
+
+/**
+ * A query builder callback provides a query object to be used to express
+ * the query to perform on the index.
+ *
+ * @callback lunr.Index~queryBuilder
+ * @param {lunr.Query} query - The query object to build up.
+ * @this lunr.Query
+ */
+
+/**
+ * Performs a query against the index using the yielded lunr.Query object.
+ *
+ * If performing programmatic queries against the index, this method is preferred
+ * over lunr.Index#search so as to avoid the additional query parsing overhead.
+ *
+ * A query object is yielded to the supplied function which should be used to
+ * express the query to be run against the index.
+ *
+ * Note that although this function takes a callback parameter it is _not_ an
+ * asynchronous operation, the callback is just yielded a query object to be
+ * customized.
+ *
+ * @param {lunr.Index~queryBuilder} fn - A function that is used to build the query.
+ * @returns {lunr.Index~Result[]}
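+ *
+ * @example <caption>Illustrative sketch (not part of the upstream docs); assumes an existing index `idx`</caption>
+ * idx.query(function (q) {
+ * // boost matches on the first term in the title field, allow one edit on the second
+ * q.term('hello', { fields: ['title'], boost: 10 })
+ * q.term('world', { editDistance: 1 })
+ * })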
+ */
+lunr.Index.prototype.query = function (fn) {
+ // for each query clause
+ // * process terms
+ // * expand terms from token set
+ // * find matching documents and metadata
+ // * get document vectors
+ // * score documents
+
+ var query = new lunr.Query(this.fields),
+ matchingFields = Object.create(null),
+ queryVectors = Object.create(null),
+ termFieldCache = Object.create(null),
+ requiredMatches = Object.create(null),
+ prohibitedMatches = Object.create(null)
+
+ /*
+ * To support field level boosts a query vector is created per
+ * field. An empty vector is eagerly created to support negated
+ * queries.
+ */
+ for (var i = 0; i < this.fields.length; i++) {
+ queryVectors[this.fields[i]] = new lunr.Vector
+ }
+
+ fn.call(query, query)
+
+ for (var i = 0; i < query.clauses.length; i++) {
+ /*
+ * Unless the pipeline has been disabled for this term, which is
+ * the case for terms with wildcards, we need to pass the clause
+ * term through the search pipeline. A pipeline returns an array
+ * of processed terms. Pipeline functions may expand the passed
+ * term, which means we may end up performing multiple index lookups
+ * for a single query term.
+ */
+ var clause = query.clauses[i],
+ terms = null,
+ clauseMatches = lunr.Set.empty
+
+ if (clause.usePipeline) {
+ terms = this.pipeline.runString(clause.term, {
+ fields: clause.fields
+ })
+ } else {
+ terms = [clause.term]
+ }
+
+ for (var m = 0; m < terms.length; m++) {
+ var term = terms[m]
+
+ /*
+ * Each term returned from the pipeline needs to use the same query
+ * clause object, e.g. the same boost and or edit distance. The
+ * simplest way to do this is to re-use the clause object but mutate
+ * its term property.
+ */
+ clause.term = term
+
+ /*
+ * From the term in the clause we create a token set which will then
+ * be used to intersect the indexes token set to get a list of terms
+ * to lookup in the inverted index
+ */
+ var termTokenSet = lunr.TokenSet.fromClause(clause),
+ expandedTerms = this.tokenSet.intersect(termTokenSet).toArray()
+
+ /*
+ * If a term marked as required does not exist in the tokenSet it is
+ * impossible for the search to return any matches. We set all the field
+ * scoped required matches set to empty and stop examining any further
+ * clauses.
+ */
+ if (expandedTerms.length === 0 && clause.presence === lunr.Query.presence.REQUIRED) {
+ for (var k = 0; k < clause.fields.length; k++) {
+ var field = clause.fields[k]
+ requiredMatches[field] = lunr.Set.empty
+ }
+
+ break
+ }
+
+ for (var j = 0; j < expandedTerms.length; j++) {
+ /*
+ * For each term get the posting and termIndex, this is required for
+ * building the query vector.
+ */
+ var expandedTerm = expandedTerms[j],
+ posting = this.invertedIndex[expandedTerm],
+ termIndex = posting._index
+
+ for (var k = 0; k < clause.fields.length; k++) {
+ /*
+ * For each field that this query term is scoped by (by default
+ * all fields are in scope) we need to get all the document refs
+ * that have this term in that field.
+ *
+ * The posting is the entry in the invertedIndex for the matching
+ * term from above.
+ */
+ var field = clause.fields[k],
+ fieldPosting = posting[field],
+ matchingDocumentRefs = Object.keys(fieldPosting),
+ termField = expandedTerm + "/" + field,
+ matchingDocumentsSet = new lunr.Set(matchingDocumentRefs)
+
+ /*
+ * if the presence of this term is required ensure that the matching
+ * documents are added to the set of required matches for this clause.
+ *
+ */
+ if (clause.presence == lunr.Query.presence.REQUIRED) {
+ clauseMatches = clauseMatches.union(matchingDocumentsSet)
+
+ if (requiredMatches[field] === undefined) {
+ requiredMatches[field] = lunr.Set.complete
+ }
+ }
+
+ /*
+ * if the presence of this term is prohibited ensure that the matching
+ * documents are added to the set of prohibited matches for this field,
+ * creating that set if it does not yet exist.
+ */
+ if (clause.presence == lunr.Query.presence.PROHIBITED) {
+ if (prohibitedMatches[field] === undefined) {
+ prohibitedMatches[field] = lunr.Set.empty
+ }
+
+ prohibitedMatches[field] = prohibitedMatches[field].union(matchingDocumentsSet)
+
+ /*
+ * Prohibited matches should not be part of the query vector used for
+ * similarity scoring and no metadata should be extracted so we continue
+ * to the next field
+ */
+ continue
+ }
+
+ /*
+ * The query field vector is populated using the termIndex found for
+ * the term and a unit value with the appropriate boost applied.
+ * Using upsert because there could already be an entry in the vector
+ * for the term we are working with. In that case we just add the scores
+ * together.
+ */
+ queryVectors[field].upsert(termIndex, clause.boost, function (a, b) { return a + b })
+
+ /**
+ * If we've already seen this term/field combo then we've already collected
+ * the matching documents and metadata, no need to go through all that again
+ */
+ if (termFieldCache[termField]) {
+ continue
+ }
+
+ for (var l = 0; l < matchingDocumentRefs.length; l++) {
+ /*
+ * All metadata for this term/field/document triple
+ * are then extracted and collected into an instance
+ * of lunr.MatchData ready to be returned in the query
+ * results
+ */
+ var matchingDocumentRef = matchingDocumentRefs[l],
+ matchingFieldRef = new lunr.FieldRef (matchingDocumentRef, field),
+ metadata = fieldPosting[matchingDocumentRef],
+ fieldMatch
+
+ if ((fieldMatch = matchingFields[matchingFieldRef]) === undefined) {
+ matchingFields[matchingFieldRef] = new lunr.MatchData (expandedTerm, field, metadata)
+ } else {
+ fieldMatch.add(expandedTerm, field, metadata)
+ }
+
+ }
+
+ termFieldCache[termField] = true
+ }
+ }
+ }
+
+ /**
+ * If the presence was required we need to update the requiredMatches field sets.
+ * We do this after all fields for the term have collected their matches because
+ * the clause term's presence is required in _any_ of the fields, not _all_ of the
+ * fields.
+ */
+ if (clause.presence === lunr.Query.presence.REQUIRED) {
+ for (var k = 0; k < clause.fields.length; k++) {
+ var field = clause.fields[k]
+ requiredMatches[field] = requiredMatches[field].intersect(clauseMatches)
+ }
+ }
+ }
+
+ /**
+ * Need to combine the field scoped required and prohibited
+ * matching documents into a global set of required and prohibited
+ * matches
+ */
+ var allRequiredMatches = lunr.Set.complete,
+ allProhibitedMatches = lunr.Set.empty
+
+ for (var i = 0; i < this.fields.length; i++) {
+ var field = this.fields[i]
+
+ if (requiredMatches[field]) {
+ allRequiredMatches = allRequiredMatches.intersect(requiredMatches[field])
+ }
+
+ if (prohibitedMatches[field]) {
+ allProhibitedMatches = allProhibitedMatches.union(prohibitedMatches[field])
+ }
+ }
+
+ var matchingFieldRefs = Object.keys(matchingFields),
+ results = [],
+ matches = Object.create(null)
+
+ /*
+ * If the query is negated (contains only prohibited terms)
+ * we need to get _all_ fieldRefs currently existing in the
+ * index. This is only done when we know that the query is
+ * entirely prohibited terms to avoid any cost of getting all
+ * fieldRefs unnecessarily.
+ *
+ * Additionally, blank MatchData must be created to correctly
+ * populate the results.
+ */
+ if (query.isNegated()) {
+ matchingFieldRefs = Object.keys(this.fieldVectors)
+
+ for (var i = 0; i < matchingFieldRefs.length; i++) {
+ var matchingFieldRef = matchingFieldRefs[i]
+ var fieldRef = lunr.FieldRef.fromString(matchingFieldRef)
+ matchingFields[matchingFieldRef] = new lunr.MatchData
+ }
+ }
+
+ for (var i = 0; i < matchingFieldRefs.length; i++) {
+ /*
+ * Currently we have document fields that match the query, but we
+ * need to return documents. The matchData and scores are combined
+ * from multiple fields belonging to the same document.
+ *
+ * Scores are calculated by field, using the query vectors created
+ * above, and combined into a final document score using addition.
+ */
+ var fieldRef = lunr.FieldRef.fromString(matchingFieldRefs[i]),
+ docRef = fieldRef.docRef
+
+ if (!allRequiredMatches.contains(docRef)) {
+ continue
+ }
+
+ if (allProhibitedMatches.contains(docRef)) {
+ continue
+ }
+
+ var fieldVector = this.fieldVectors[fieldRef],
+ score = queryVectors[fieldRef.fieldName].similarity(fieldVector),
+ docMatch
+
+ if ((docMatch = matches[docRef]) !== undefined) {
+ docMatch.score += score
+ docMatch.matchData.combine(matchingFields[fieldRef])
+ } else {
+ var match = {
+ ref: docRef,
+ score: score,
+ matchData: matchingFields[fieldRef]
+ }
+ matches[docRef] = match
+ results.push(match)
+ }
+ }
+
+ /*
+ * Sort the results objects by score, highest first.
+ */
+ return results.sort(function (a, b) {
+ return b.score - a.score
+ })
+}
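+
+/*
+ * Usage sketch (illustrative, not part of lunr itself): the clause handling
+ * above (pipeline expansion, per-field boosts, required and prohibited
+ * presence) can be driven through lunr.Index#query. The index `idx`, the
+ * field names and the terms below are assumptions for the example.
+ *
+ *   idx.query(function (q) {
+ *     q.term("stream", { fields: ["title"], boost: 10 })
+ *     q.term("usb", { presence: lunr.Query.presence.REQUIRED })
+ *     q.term("draft", { presence: lunr.Query.presence.PROHIBITED })
+ *   })
+ */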
+
+/**
+ * Prepares the index for JSON serialization.
+ *
+ * The schema for this JSON blob will be described in a
+ * separate JSON schema file.
+ *
+ * @returns {Object}
+ */
+lunr.Index.prototype.toJSON = function () {
+ var invertedIndex = Object.keys(this.invertedIndex)
+ .sort()
+ .map(function (term) {
+ return [term, this.invertedIndex[term]]
+ }, this)
+
+ var fieldVectors = Object.keys(this.fieldVectors)
+ .map(function (ref) {
+ return [ref, this.fieldVectors[ref].toJSON()]
+ }, this)
+
+ return {
+ version: lunr.version,
+ fields: this.fields,
+ fieldVectors: fieldVectors,
+ invertedIndex: invertedIndex,
+ pipeline: this.pipeline.toJSON()
+ }
+}
+
+/**
+ * Loads a previously serialized lunr.Index
+ *
+ * @param {Object} serializedIndex - A previously serialized lunr.Index
+ * @returns {lunr.Index}
+ */
+lunr.Index.load = function (serializedIndex) {
+ var attrs = {},
+ fieldVectors = {},
+ serializedVectors = serializedIndex.fieldVectors,
+ invertedIndex = Object.create(null),
+ serializedInvertedIndex = serializedIndex.invertedIndex,
+ tokenSetBuilder = new lunr.TokenSet.Builder,
+ pipeline = lunr.Pipeline.load(serializedIndex.pipeline)
+
+ if (serializedIndex.version != lunr.version) {
+ lunr.utils.warn("Version mismatch when loading serialised index. Current version of lunr '" + lunr.version + "' does not match serialized index '" + serializedIndex.version + "'")
+ }
+
+ for (var i = 0; i < serializedVectors.length; i++) {
+ var tuple = serializedVectors[i],
+ ref = tuple[0],
+ elements = tuple[1]
+
+ fieldVectors[ref] = new lunr.Vector(elements)
+ }
+
+ for (var i = 0; i < serializedInvertedIndex.length; i++) {
+ var tuple = serializedInvertedIndex[i],
+ term = tuple[0],
+ posting = tuple[1]
+
+ tokenSetBuilder.insert(term)
+ invertedIndex[term] = posting
+ }
+
+ tokenSetBuilder.finish()
+
+ attrs.fields = serializedIndex.fields
+
+ attrs.fieldVectors = fieldVectors
+ attrs.invertedIndex = invertedIndex
+ attrs.tokenSet = tokenSetBuilder.root
+ attrs.pipeline = pipeline
+
+ return new lunr.Index(attrs)
+}
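+
+/*
+ * Round-trip sketch (illustrative, not part of lunr itself): toJSON above
+ * produces a plain object that can be stringified and later revived with
+ * lunr.Index.load, e.g. to ship a pre-built search index as a static asset.
+ * The index `idx` and the search term are assumptions for the example.
+ *
+ *   var serialized = JSON.stringify(idx)   // JSON.stringify invokes idx.toJSON()
+ *   var revived = lunr.Index.load(JSON.parse(serialized))
+ *   revived.search("gopro")
+ */
+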
+/*!
+ * lunr.Builder
+ * Copyright (C) 2020 Oliver Nightingale
+ */
+
+/**
+ * lunr.Builder performs indexing on a set of documents and
+ * returns instances of lunr.Index ready for querying.
+ *
+ * All configuration of the index is done via the builder: the
+ * fields to index, the document reference, the text processing
+ * pipeline and document scoring parameters are all set on the
+ * builder before indexing.
+ *
+ * @constructor
+ * @property {string} _ref - Internal reference to the document reference field.
+ * @property {string[]} _fields - Internal reference to the document fields to index.
+ * @property {object} invertedIndex - The inverted index maps terms to document fields.
+ * @property {object} fieldTermFrequencies - Keeps track of term frequencies for each field of each document.
+ * @property {object} fieldLengths - Keeps track of the length of each field of each document added to the index.
+ * @property {lunr.tokenizer} tokenizer - Function for splitting strings into tokens for indexing.
+ * @property {lunr.Pipeline} pipeline - The pipeline performs text processing on tokens before indexing.
+ * @property {lunr.Pipeline} searchPipeline - A pipeline for processing search terms before querying the index.
+ * @property {number} documentCount - Keeps track of the total number of documents indexed.
+ * @property {number} _b - A parameter to control field length normalization, setting this to 0 disables normalization, 1 fully normalizes field lengths; the default value is 0.75.
+ * @property {number} _k1 - A parameter to control how quickly an increase in term frequency results in term frequency saturation, the default value is 1.2.
+ * @property {number} termIndex - A counter incremented for each unique term, used to identify a term's position in the vector space.
+ * @property {array} metadataWhitelist - A list of metadata keys that have been whitelisted for entry in the index.
+ */
+lunr.Builder = function () {
+ this._ref = "id"
+ this._fields = Object.create(null)
+ this._documents = Object.create(null)
+ this.invertedIndex = Object.create(null)
+ this.fieldTermFrequencies = {}
+ this.fieldLengths = {}
+ this.tokenizer = lunr.tokenizer
+ this.pipeline = new lunr.Pipeline
+ this.searchPipeline = new lunr.Pipeline
+ this.documentCount = 0
+ this._b = 0.75
+ this._k1 = 1.2
+ this.termIndex = 0
+ this.metadataWhitelist = []
+}
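+
+/*
+ * Metadata sketch (illustrative, not part of lunr itself): whitelisting token
+ * metadata makes it available on search results via lunr.MatchData. The
+ * 'position' key is an assumption here; it is the metadata recorded by the
+ * default tokenizer in lunr 2.x, not something defined in this file.
+ *
+ *   var builder = new lunr.Builder
+ *   builder.metadataWhitelist = ['position']
+ */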
+
+/**
+ * Sets the document field used as the document reference. Every document must have this field.
+ * The type of this field in the document should be a string, if it is not a string it will be
+ * coerced into a string by calling toString.
+ *
+ * The default ref is 'id'.
+ *
+ * The ref should _not_ be changed during indexing, it should be set before any documents are
+ * added to the index. Changing it during indexing can lead to inconsistent results.
+ *
+ * @param {string} ref - The name of the reference field in the document.
+ */
+lunr.Builder.prototype.ref = function (ref) {
+ this._ref = ref
+}
+
+/**
+ * A function that is used to extract a field from a document.
+ *
+ * Lunr expects a field to be at the top level of a document, if however the field
+ * is deeply nested within a document an extractor function can be used to extract
+ * the right field for indexing.
+ *
+ * @callback fieldExtractor
+ * @param {object} doc - The document being added to the index.
+ * @returns {?(string|object|object[])} obj - The object that will be indexed for this field.
+ * @example <caption>Extracting a nested field</caption>
+ * function (doc) { return doc.nested.field }
+ */
+
+/**
+ * Adds a field to the list of document fields that will be indexed. Every document being
+ * indexed should have this field. Null values for this field in indexed documents will
+ * not cause errors but will limit the chance of that document being retrieved by searches.
+ *
+ * All fields should be added before adding documents to the index. Adding fields after
+ * a document has been indexed will have no effect on already indexed documents.
+ *
+ * Fields can be boosted at build time. This allows terms within that field to have more
+ * importance when ranking search results. Use a field boost to specify that matches within
+ * one field are more important than other fields.
+ *
+ * @param {string} fieldName - The name of a field to index in all documents.
+ * @param {object} attributes - Optional attributes associated with this field.
+ * @param {number} [attributes.boost=1] - Boost applied to all terms within this field.
+ * @param {fieldExtractor} [attributes.extractor] - Function to extract a field from a document.
+ * @throws {RangeError} fieldName cannot contain unsupported characters '/'
+ */
+lunr.Builder.prototype.field = function (fieldName, attributes) {
+ if (/\//.test(fieldName)) {
+ throw new RangeError ("Field '" + fieldName + "' contains illegal character '/'")
+ }
+
+ this._fields[fieldName] = attributes || {}
+}
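+
+/*
+ * Configuration sketch (illustrative, not part of lunr itself): continuing the
+ * builder from the sketch above, the ref, field boost and extractor options
+ * might be wired up like this for a hypothetical document shape
+ * { url, title, nested: { body } }.
+ *
+ *   builder.ref("url")
+ *   builder.field("title", { boost: 10 })
+ *   builder.field("body", { extractor: function (doc) { return doc.nested.body } })
+ */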
+
+/**
+ * A parameter to tune the amount of field length normalisation that is applied when
+ * calculating relevance scores. A value of 0 will completely disable any normalisation
+ * and a value of 1 will fully normalise field lengths. The default is 0.75. Values of b
+ * will be clamped to the range 0 - 1.
+ *
+ * @param {number} number - The value to set for this tuning parameter.
+ */
+lunr.Builder.prototype.b = function (number) {
+ if (number < 0) {
+ this._b = 0
+ } else if (number > 1) {
+ this._b = 1
+ } else {
+ this._b = number
+ }
+}
+
+/**
+ * A parameter that controls the speed at which a rise in term frequency results in term
+ * frequency saturation. The default value is 1.2. Setting this to a higher value will give
+ * slower saturation levels, a lower value will result in quicker saturation.
+ *
+ * @param {number} number - The value to set for this tuning parameter.
+ */
+lunr.Builder.prototype.k1 = function (number) {
+ this._k1 = number
+}
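+
+/*
+ * Tuning sketch (illustrative, not part of lunr itself): b is clamped to the
+ * range 0 - 1 by the setter above while k1 is taken as-is; both feed the
+ * BM25-style score computed in createFieldVectors.
+ *
+ *   builder.b(0.5)    // less field length normalisation than the 0.75 default
+ *   builder.k1(1.6)   // slower term frequency saturation than the 1.2 default
+ */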
+
+/**
+ * Adds a document to the index.
+ *
+ * Before adding documents, the index should have been fully set up, with the document
+ * ref and all fields to index already having been specified.
+ *
+ * The document must have a field whose name matches the ref (by default this is 'id') and
+ * it should have all fields defined for indexing, though null or undefined values will not
+ * cause errors.
+ *
+ * Entire documents can be boosted at build time. Applying a boost to a document indicates that
+ * this document should rank higher in search results than other documents.
+ *
+ * @param {object} doc - The document to add to the index.
+ * @param {object} attributes - Optional attributes associated with this document.
+ * @param {number} [attributes.boost=1] - Boost applied to all terms within this document.
+ */
+lunr.Builder.prototype.add = function (doc, attributes) {
+ var docRef = doc[this._ref],
+ fields = Object.keys(this._fields)
+
+ this._documents[docRef] = attributes || {}
+ this.documentCount += 1
+
+ for (var i = 0; i < fields.length; i++) {
+ var fieldName = fields[i],
+ extractor = this._fields[fieldName].extractor,
+ field = extractor ? extractor(doc) : doc[fieldName],
+ tokens = this.tokenizer(field, {
+ fields: [fieldName]
+ }),
+ terms = this.pipeline.run(tokens),
+ fieldRef = new lunr.FieldRef (docRef, fieldName),
+ fieldTerms = Object.create(null)
+
+ this.fieldTermFrequencies[fieldRef] = fieldTerms
+ this.fieldLengths[fieldRef] = 0
+
+ // store the length of this field for this document
+ this.fieldLengths[fieldRef] += terms.length
+
+ // calculate term frequencies for this field
+ for (var j = 0; j < terms.length; j++) {
+ var term = terms[j]
+
+ if (fieldTerms[term] == undefined) {
+ fieldTerms[term] = 0
+ }
+
+ fieldTerms[term] += 1
+
+ // add to inverted index
+ // create an initial posting if one doesn't exist
+ if (this.invertedIndex[term] == undefined) {
+ var posting = Object.create(null)
+ posting["_index"] = this.termIndex
+ this.termIndex += 1
+
+ for (var k = 0; k < fields.length; k++) {
+ posting[fields[k]] = Object.create(null)
+ }
+
+ this.invertedIndex[term] = posting
+ }
+
+ // add an entry for this term/fieldName/docRef to the invertedIndex
+ if (this.invertedIndex[term][fieldName][docRef] == undefined) {
+ this.invertedIndex[term][fieldName][docRef] = Object.create(null)
+ }
+
+ // store all whitelisted metadata about this token in the
+ // inverted index
+ for (var l = 0; l < this.metadataWhitelist.length; l++) {
+ var metadataKey = this.metadataWhitelist[l],
+ metadata = term.metadata[metadataKey]
+
+ if (this.invertedIndex[term][fieldName][docRef][metadataKey] == undefined) {
+ this.invertedIndex[term][fieldName][docRef][metadataKey] = []
+ }
+
+ this.invertedIndex[term][fieldName][docRef][metadataKey].push(metadata)
+ }
+ }
+
+ }
+}
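+
+/*
+ * Indexing sketch (illustrative, not part of lunr itself): documents are added
+ * once the ref and fields are configured; the optional attributes object
+ * carries the per-document boost described above. The document values are
+ * assumptions for the example.
+ *
+ *   builder.add({
+ *     url: "/getting-started",
+ *     title: "Getting Started",
+ *     nested: { body: "Some page text to index." }
+ *   }, { boost: 2 })
+ */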
+
+/**
+ * Calculates the average field length for each field in this index
+ *
+ * @private
+ */
+lunr.Builder.prototype.calculateAverageFieldLengths = function () {
+
+ var fieldRefs = Object.keys(this.fieldLengths),
+ numberOfFields = fieldRefs.length,
+ accumulator = {},
+ documentsWithField = {}
+
+ for (var i = 0; i < numberOfFields; i++) {
+ var fieldRef = lunr.FieldRef.fromString(fieldRefs[i]),
+ field = fieldRef.fieldName
+
+ documentsWithField[field] || (documentsWithField[field] = 0)
+ documentsWithField[field] += 1
+
+ accumulator[field] || (accumulator[field] = 0)
+ accumulator[field] += this.fieldLengths[fieldRef]
+ }
+
+ var fields = Object.keys(this._fields)
+
+ for (var i = 0; i < fields.length; i++) {
+ var fieldName = fields[i]
+ accumulator[fieldName] = accumulator[fieldName] / documentsWithField[fieldName]
+ }
+
+ this.averageFieldLength = accumulator
+}
+
+/**
+ * Builds a vector space model of every document using lunr.Vector
+ *
+ * @private
+ */
+lunr.Builder.prototype.createFieldVectors = function () {
+ var fieldVectors = {},
+ fieldRefs = Object.keys(this.fieldTermFrequencies),
+ fieldRefsLength = fieldRefs.length,
+ termIdfCache = Object.create(null)
+
+ for (var i = 0; i < fieldRefsLength; i++) {
+ var fieldRef = lunr.FieldRef.fromString(fieldRefs[i]),
+ fieldName = fieldRef.fieldName,
+ fieldLength = this.fieldLengths[fieldRef],
+ fieldVector = new lunr.Vector,
+ termFrequencies = this.fieldTermFrequencies[fieldRef],
+ terms = Object.keys(termFrequencies),
+ termsLength = terms.length
+
+
+ var fieldBoost = this._fields[fieldName].boost || 1,
+ docBoost = this._documents[fieldRef.docRef].boost || 1
+
+ for (var j = 0; j < termsLength; j++) {
+ var term = terms[j],
+ tf = termFrequencies[term],
+ termIndex = this.invertedIndex[term]._index,
+ idf, score, scoreWithPrecision
+
+ if (termIdfCache[term] === undefined) {
+ idf = lunr.idf(this.invertedIndex[term], this.documentCount)
+ termIdfCache[term] = idf
+ } else {
+ idf = termIdfCache[term]
+ }
+
+ score = idf * ((this._k1 + 1) * tf) / (this._k1 * (1 - this._b + this._b * (fieldLength / this.averageFieldLength[fieldName])) + tf)
+ score *= fieldBoost
+ score *= docBoost
+ scoreWithPrecision = Math.round(score * 1000) / 1000
+ // Converts 1.23456789 to 1.234.
+ // Reducing the precision so that the vectors take up less
+ // space when serialised. Doing it now so that they behave
+ // the same before and after serialisation. Also, this is
+ // the fastest approach to reducing a number's precision in
+ // JavaScript.
+
+ fieldVector.insert(termIndex, scoreWithPrecision)
+ }
+
+ fieldVectors[fieldRef] = fieldVector
+ }
+
+ this.fieldVectors = fieldVectors
+}
+
+/**
+ * Creates a token set of all tokens in the index using lunr.TokenSet
+ *
+ * @private
+ */
+lunr.Builder.prototype.createTokenSet = function () {
+ this.tokenSet = lunr.TokenSet.fromArray(
+ Object.keys(this.invertedIndex).sort()
+ )
+}
+
+/**
+ * Builds the index, creating an instance of lunr.Index.
+ *
+ * This completes the indexing process and should only be called
+ * once all documents have been added to the index.
+ *
+ * @returns {lunr.Index}
+ */
+lunr.Builder.prototype.build = function () {
+ this.calculateAverageFieldLengths()
+ this.createFieldVectors()
+ this.createTokenSet()
+
+ return new lunr.Index({
+ invertedIndex: this.invertedIndex,
+ fieldVectors: this.fieldVectors,
+ tokenSet: this.tokenSet,
+ fields: Object.keys(this._fields),
+ pipeline: this.searchPipeline
+ })
+}
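+
+/*
+ * Build sketch (illustrative, not part of lunr itself): once every document
+ * has been added, build() produces a lunr.Index ready for querying. Note that
+ * the index is handed the searchPipeline, not the indexing pipeline.
+ *
+ *   var idx = builder.build()
+ *   var results = idx.search("stream")   // [{ ref, score, matchData }, ...]
+ */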
+
+/**
+ * Applies a plugin to the index builder.
+ *
+ * A plugin is just a function that encapsulates custom behaviour to be
+ * applied when building the index. Plugins can be used to customise or
+ * extend the behaviour of the index in some way.
+ *
+ * The plugin function is called with the index builder as both its context
+ * and its first argument; any additional arguments passed to use are
+ * forwarded to the plugin after the builder.
+ *
+ * @param {Function} plugin The plugin to apply.
+ */
+lunr.Builder.prototype.use = function (fn) {
+ var args = Array.prototype.slice.call(arguments, 1)
+ args.unshift(this)
+ fn.apply(this, args)
+}
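+
+/*
+ * Plugin sketch (illustrative, not part of lunr itself): a plugin is just a
+ * function invoked with the builder as both its context and first argument,
+ * followed by any extra arguments passed to use(). The plugin below is a
+ * made-up example.
+ *
+ *   var refPlugin = function (builder, refName) {
+ *     builder.ref(refName)
+ *   }
+ *
+ *   builder.use(refPlugin, "slug")   // equivalent to builder.ref("slug")
+ */
+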
+/**
+ * Contains and collects metadata about a matching document.
+ * A single instance of lunr.MatchData is returned as part of every
+ * lunr.Index~Result.
+ *
+ * @constructor
+ * @param {string} term - The term this match data is associated with
+ * @param {string} field - The field in which the term was found
+ * @param {object} metadata - The metadata recorded about this term in this field
+ * @property {object} metadata - A cloned collection of metadata associated with this document.
+ * @see {@link lunr.Index~Result}
+ */
+lunr.MatchData = function (term, field, metadata) {
+ var clonedMetadata = Object.create(null),
+ metadataKeys = Object.keys(metadata || {})
+
+ // Cloning the metadata to prevent the original
+ // being mutated during match data combination.
+ // Metadata is kept in an array within the inverted
+ // index so cloning the data can be done with
+ // Array#slice
+ for (var i = 0; i < metadataKeys.length; i++) {
+ var key = metadataKeys[i]
+ clonedMetadata[key] = metadata[key].slice()
+ }
+
+ this.metadata = Object.create(null)
+
+ if (term !== undefined) {
+ this.metadata[term] = Object.create(null)
+ this.metadata[term][field] = clonedMetadata
+ }
+}
+
+/**
+ * An instance of lunr.MatchData will be created for every term that matches a
+ * document. However only one instance is required in a lunr.Index~Result. This
+ * method combines metadata from another instance of lunr.MatchData with this
+ * object's metadata.
+ *
+ * @param {lunr.MatchData} otherMatchData - Another instance of match data to merge with this one.
+ * @see {@link lunr.Index~Result}
+ */
+lunr.MatchData.prototype.combine = function (otherMatchData) {
+ var terms = Object.keys(otherMatchData.metadata)
+
+ for (var i = 0; i < terms.length; i++) {
+ var term = terms[i],
+ fields = Object.keys(otherMatchData.metadata[term])
+
+ if (this.metadata[term] == undefined) {
+ this.metadata[term] = Object.create(null)
+ }
+
+ for (var j = 0; j < fields.length; j++) {
+ var field = fields[j],
+ keys = Object.keys(otherMatchData.metadata[term][field])
+
+ if (this.metadata[term][field] == undefined) {
+ this.metadata[term][field] = Object.create(null)
+ }
+
+ for (var k = 0; k < keys.length; k++) {
+ var key = keys[k]
+
+ if (this.metadata[term][field][key] == undefined) {
+ this.metadata[term][field][key] = otherMatchData.metadata[term][field][key]
+ } else {
+ this.metadata[term][field][key] = this.metadata[term][field][key].concat(otherMatchData.metadata[term][field][key])
+ }
+
+ }
+ }
+ }
+}
+
+/**
+ * Add metadata for a term/field pair to this instance of match data.
+ *
+ * @param {string} term - The term this match data is associated with
+ * @param {string} field - The field in which the term was found
+ * @param {object} metadata - The metadata recorded about this term in this field
+ */
+lunr.MatchData.prototype.add = function (term, field, metadata) {
+ if (!(term in this.metadata)) {
+ this.metadata[term] = Object.create(null)
+ this.metadata[term][field] = metadata
+ return
+ }
+
+ if (!(field in this.metadata[term])) {
+ this.metadata[term][field] = metadata
+ return
+ }
+
+ var metadataKeys = Object.keys(metadata)
+
+ for (var i = 0; i < metadataKeys.length; i++) {
+ var key = metadataKeys[i]
+
+ if (key in this.metadata[term][field]) {
+ this.metadata[term][field][key] = this.metadata[term][field][key].concat(metadata[key])
+ } else {
+ this.metadata[term][field][key] = metadata[key]
+ }
+ }
+}
+/**
+ * A lunr.Query provides a programmatic way of defining queries to be performed
+ * against a {@link lunr.Index}.
+ *
+ * Prefer constructing a lunr.Query using the {@link lunr.Index#query} method
+ * so the query object is pre-initialized with the right index fields.
+ *
+ * @constructor
+ * @property {lunr.Query~Clause[]} clauses - An array of query clauses.
+ * @property {string[]} allFields - An array of all available fields in a lunr.Index.
+ */
+lunr.Query = function (allFields) {
+ this.clauses = []
+ this.allFields = allFields
+}
+
+/**
+ * Constants for indicating what kind of automatic wildcard insertion will be used when constructing a query clause.
+ *
+ * This allows wildcards to be added to the beginning and end of a term without having to manually do any string
+ * concatenation.
+ *
+ * The wildcard constants can be bitwise combined to select both leading and trailing wildcards.
+ *
+ * @constant
+ * @default
+ * @property {number} wildcard.NONE - The term will have no wildcards inserted, this is the default behaviour
+ * @property {number} wildcard.LEADING - Prepend the term with a wildcard, unless a leading wildcard already exists
+ * @property {number} wildcard.TRAILING - Append a wildcard to the term, unless a trailing wildcard already exists
+ * @see lunr.Query~Clause
+ * @see lunr.Query#clause
+ * @see lunr.Query#term
+ * @example
+ * query.term('foo', {
+ * wildcard: lunr.Query.wildcard.LEADING | lunr.Query.wildcard.TRAILING
+ * })
+ */
+
+lunr.Query.wildcard = new String ("*")
+lunr.Query.wildcard.NONE = 0
+lunr.Query.wildcard.LEADING = 1
+lunr.Query.wildcard.TRAILING = 2
+
+/**
+ * Constants for indicating what kind of presence a term must have in matching documents.
+ *
+ * @constant
+ * @enum {number}
+ * @see lunr.Query~Clause
+ * @see lunr.Query#clause
+ * @see lunr.Query#term
+ * @example <caption>query term with required presence</caption>
+ * query.term('foo', { presence: lunr.Query.presence.REQUIRED })
+ */
+lunr.Query.presence = {
+ /**
+ * Term's presence in a document is optional, this is the default value.
+ */
+ OPTIONAL: 1,
+
+ /**
+ * Term's presence in a document is required, documents that do not contain
+ * this term will not be returned.
+ */
+ REQUIRED: 2,
+
+ /**
+ * Term's presence in a document is prohibited, documents that do contain
+ * this term will not be returned.
+ */
+ PROHIBITED: 3
+}
+
+/**
+ * A single clause in a {@link lunr.Query} contains a term and details on how to
+ * match that term against a {@link lunr.Index}.
+ *
+ * @typedef {Object} lunr.Query~Clause
+ * @property {string[]} fields - The fields in an index this clause should be matched against.
+ * @property {number} [boost=1] - Any boost that should be applied when matching this clause.
+ * @property {number} [editDistance] - Whether the term should have fuzzy matching applied, and how fuzzy the match should be.
+ * @property {boolean} [usePipeline] - Whether the term should be passed through the search pipeline.
+ * @property {number} [wildcard=lunr.Query.wildcard.NONE] - Whether the term should have wildcards appended or prepended.
+ * @property {number} [presence=lunr.Query.presence.OPTIONAL] - The term's presence in any matching documents.
+ */
+
+/**
+ * Adds a {@link lunr.Query~Clause} to this query.
+ *
+ * Unless the clause specifies the fields to be matched, all fields will be matched. In addition
+ * a default boost of 1 is applied to the clause.
+ *
+ * @param {lunr.Query~Clause} clause - The clause to add to this query.
+ * @see lunr.Query~Clause
+ * @returns {lunr.Query}
+ */
+lunr.Query.prototype.clause = function (clause) {
+ if (!('fields' in clause)) {
+ clause.fields = this.allFields
+ }
+
+ if (!('boost' in clause)) {
+ clause.boost = 1
+ }
+
+ if (!('usePipeline' in clause)) {
+ clause.usePipeline = true
+ }
+
+ if (!('wildcard' in clause)) {
+ clause.wildcard = lunr.Query.wildcard.NONE
+ }
+
+ if ((clause.wildcard & lunr.Query.wildcard.LEADING) && (clause.term.charAt(0) != lunr.Query.wildcard)) {
+ clause.term = "*" + clause.term
+ }
+
+ if ((clause.wildcard & lunr.Query.wildcard.TRAILING) && (clause.term.slice(-1) != lunr.Query.wildcard)) {
+ clause.term = "" + clause.term + "*"
+ }
+
+ if (!('presence' in clause)) {
+ clause.presence = lunr.Query.presence.OPTIONAL
+ }
+
+ this.clauses.push(clause)
+
+ return this
+}
+
+/**
+ * A negated query is one in which every clause has a presence of
+ * prohibited. These queries require some special processing to return
+ * the expected results.
+ *
+ * @returns boolean
+ */
+lunr.Query.prototype.isNegated = function () {
+ for (var i = 0; i < this.clauses.length; i++) {
+ if (this.clauses[i].presence != lunr.Query.presence.PROHIBITED) {
+ return false
+ }
+ }
+
+ return true
+}
+
+/**
+ * Adds a term to the current query, under the covers this will create a {@link lunr.Query~Clause}
+ * and add it to the list of clauses that make up this query.
+ *
+ * The term is used as is, i.e. no tokenization will be performed by this method. Instead conversion
+ * to a token or token-like string should be done before calling this method.
+ *
+ * The term will be converted to a string by calling `toString`. Multiple terms can be passed as an
+ * array, each term in the array will share the same options.
+ *
+ * @param {object|object[]} term - The term(s) to add to the query.
+ * @param {object} [options] - Any additional properties to add to the query clause.
+ * @returns {lunr.Query}
+ * @see lunr.Query#clause
+ * @see lunr.Query~Clause
+ * @example <caption>adding a single term to a query</caption>
+ * query.term("foo")
+ * @example <caption>adding a single term to a query and specifying search fields, term boost and automatic trailing wildcard</caption>
")),e.inlineElement=o}return h.updateStatus("ready"),h._parseMarkup(t,{},e),t}}});function N(){I&&l(document.body).removeClass(I)}function j(){N(),h.req&&h.req.abort()}var I,L="ajax";l.magnificPopup.registerModule(L,{options:{settings:null,cursor:"mfp-ajax-cur",tError:'The content could not be loaded.'},proto:{initAjax:function(){h.types.push(L),I=h.st.ajax.cursor,c(u+"."+L,j),c("BeforeChange."+L,j)},getAjax:function(r){I&&l(document.body).addClass(I),h.updateStatus("loading");var e=l.extend({url:r.src,success:function(e,t,n){n={data:e,xhr:n};d("ParseAjax",n),h.appendContent(l(n.data),L),r.finished=!0,N(),h._setFocus(),setTimeout(function(){h.wrap.addClass(w)},16),h.updateStatus("ready"),d("AjaxContentAdded")},error:function(){N(),r.finished=r.loadError=!0,h.updateStatus("error",h.st.ajax.tError.replace("%url%",r.src))}},h.st.ajax.settings);return h.req=l.ajax(e),""}}});var D;l.magnificPopup.registerModule("image",{options:{markup:'
',cursor:"mfp-zoom-out-cur",titleSrc:"title",verticalFit:!0,tError:'The image could not be loaded.'},proto:{initImage:function(){var e=h.st.image,t=".image";h.types.push("image"),c(b+t,function(){"image"===h.currItem.type&&e.cursor&&l(document.body).addClass(e.cursor)}),c(u+t,function(){e.cursor&&l(document.body).removeClass(e.cursor),C.off("resize"+x)}),c("Resize"+t,h.resizeImage),h.isLowIE&&c("AfterChange",h.resizeImage)},resizeImage:function(){var e,t=h.currItem;t&&t.img&&h.st.image.verticalFit&&(e=0,h.isLowIE&&(e=parseInt(t.img.css("padding-top"),10)+parseInt(t.img.css("padding-bottom"),10)),t.img.css("max-height",h.wH-e))},_onImageHasSize:function(e){e.img&&(e.hasSize=!0,D&&clearInterval(D),e.isCheckingImgSize=!1,d("ImageHasSize",e),e.imgHidden&&(h.content&&h.content.removeClass("mfp-loading"),e.imgHidden=!1))},findImageSize:function(t){var n=0,r=t.img[0],o=function(e){D&&clearInterval(D),D=setInterval(function(){0
',srcAction:"iframe_src",patterns:{youtube:{index:"youtube.com",id:"v=",src:"//www.youtube.com/embed/%id%?autoplay=1"},vimeo:{index:"vimeo.com/",id:"/",src:"//player.vimeo.com/video/%id%?autoplay=1"},gmaps:{index:"//maps.google.",src:"%id%&output=embed"}}},proto:{initIframe:function(){h.types.push(P),c("BeforeChange",function(e,t,n){t!==n&&(t===P?H():n===P&&H(!0))}),c(u+"."+P,function(){H()})},getIframe:function(e,t){var n=e.src,r=h.st.iframe;l.each(r.patterns,function(){if(-1',preload:[0,2],navigateByImgClick:!0,arrows:!0,tPrev:"Previous (Left arrow key)",tNext:"Next (Right arrow key)",tCounter:"%curr% of %total%"},proto:{initGallery:function(){var i=h.st.gallery,e=".mfp-gallery";if(h.direction=!0,!i||!i.enabled)return!1;g+=" mfp-gallery",c(b+e,function(){i.navigateByImgClick&&h.wrap.on("click"+e,".mfp-img",function(){if(1=h.index,h.index=e,h.updateItemHTML()},preloadNearbyImages:function(){for(var e=h.st.gallery.preload,t=Math.min(e[0],h.items.length),n=Math.min(e[1],h.items.length),r=1;r<=(h.direction?n:t);r++)h._preloadItem(h.index+r);for(r=1;r<=(h.direction?t:n);r++)h._preloadItem(h.index-r)},_preloadItem:function(e){var t;e=q(e),h.items[e].preloaded||((t=h.items[e]).parsed||(t=h.parseEl(e)),d("LazyLoad",t),"image"===t.type&&(t.img=l('').on("load.mfploader",function(){t.hasSize=!0}).on("error.mfploader",function(){t.hasSize=!0,t.loadError=!0,d("LazyLoadError",t)}).attr("src",t.src)),t.preloaded=!0)}}});var _="retina";l.magnificPopup.registerModule(_,{options:{replaceSrc:function(e){return e.src.replace(/\.\w+$/,function(e){return"@2x"+e})},ratio:1},proto:{initRetina:function(){var n,r;1t.durationMax?t.durationMax:t.durationMin&&e=u)return b.cancelScroll(!0),e=t,n=g,0===(t=r)&&document.body.focus(),n||(t.focus(),document.activeElement!==t&&(t.setAttribute("tabindex","-1"),t.focus(),t.style.outline="none"),x.scrollTo(0,e)),E("scrollStop",m,r,o),!(y=f=null)},h=function(e){var t,n,r;l+=e-(f=f||e),d=i+s*(n=d=1<(d=0===c?0:l/c)?1:d,"easeInQuad"===(t=m).easing&&(r=n*n),"easeOutQuad"===t.easing&&(r=n*(2-n)),"easeInOutQuad"===t.easing&&(r=n<.5?2*n*n:(4-2*n)*n-1),"easeInCubic"===t.easing&&(r=n*n*n),"easeOutCubic"===t.easing&&(r=--n*n*n+1),"easeInOutCubic"===t.easing&&(r=n<.5?4*n*n*n:(n-1)*(2*n-2)*(2*n-2)+1),"easeInQuart"===t.easing&&(r=n*n*n*n),"easeOutQuart"===t.easing&&(r=1- --n*n*n*n),"easeInOutQuart"===t.easing&&(r=n<.5?8*n*n*n*n:1-8*--n*n*n*n),"easeInQuint"===t.easing&&(r=n*n*n*n*n),"easeOutQuint"===t.easing&&(r=1+--n*n*n*n*n),"easeInOutQuint"===t.easing&&(r=n<.5?16*n*n*n*n*n:1+16*--n*n*n*n*n),(r=t.customEasing?t.customEasing(n):r)||n),x.scrollTo(0,Math.floor(d)),p(d,a)||(y=x.requestAnimationFrame(h),f=e)},0===x.pageYOffset&&x.scrollTo(0,0),t=r,e=m,g||history.pushState&&e.updateURL&&history.pushState({smoothScroll:JSON.stringify(e),anchor:t.id},document.title,t===document.documentElement?"#top":"#"+t.id),"matchMedia"in x&&x.matchMedia("(prefers-reduced-motion)").matches?x.scrollTo(0,Math.floor(a)):(E("scrollStart",m,r,o),b.cancelScroll(!0),x.requestAnimationFrame(h)))};function t(e){if(!e.defaultPrevented&&!(0!==e.button||e.metaKey||e.ctrlKey||e.shiftKey)&&"closest"in e.target&&(o=e.target.closest(r))&&"a"===o.tagName.toLowerCase()&&!e.target.closest(v.ignore)&&o.hostname===x.location.hostname&&o.pathname===x.location.pathname&&/#/.test(o.href)){var t,n;try{n=a(decodeURIComponent(o.hash))}catch(e){n=a(o.hash)}if("#"===n){if(!v.topOnEmptyHash)return;t=document.documentElement}else 
t=document.querySelector(n);(t=t||"#top"!==n?t:document.documentElement)&&(e.preventDefault(),n=v,history.replaceState&&n.updateURL&&!history.state&&(e=(e=x.location.hash)||"",history.replaceState({smoothScroll:JSON.stringify(n),anchor:e||x.pageYOffset},document.title,e||x.location.href)),b.animateScroll(t,o))}}function i(e){var t;null!==history.state&&history.state.smoothScroll&&history.state.smoothScroll===JSON.stringify(v)&&("string"==typeof(t=history.state.anchor)&&t&&!(t=document.querySelector(a(history.state.anchor)))||b.animateScroll(t,null,{updateURL:!1}))}b.destroy=function(){v&&(document.removeEventListener("click",t,!1),x.removeEventListener("popstate",i,!1),b.cancelScroll(),y=n=o=v=null)};return function(){if(!("querySelector"in document&&"addEventListener"in x&&"requestAnimationFrame"in x&&"closest"in x.Element.prototype))throw"Smooth Scroll: This browser does not support the required JavaScript methods and browser APIs.";b.destroy(),v=w(S,e||{}),n=v.header?document.querySelector(v.header):null,document.addEventListener("click",t,!1),v.updateURL&&v.popstate&&x.addEventListener("popstate",i,!1)}(),b}}),function(e,t){"function"==typeof define&&define.amd?define([],function(){return t(e)}):"object"==typeof exports?module.exports=t(e):e.Gumshoe=t(e)}("undefined"!=typeof global?global:"undefined"!=typeof window?window:this,function(c){"use strict";function f(e,t,n){n.settings.events&&(n=new CustomEvent(e,{bubbles:!0,cancelable:!0,detail:n}),t.dispatchEvent(n))}function n(e){var t=0;if(e.offsetParent)for(;e;)t+=e.offsetTop,e=e.offsetParent;return 0<=t?t:0}function d(e){e&&e.sort(function(e,t){return n(e.content)=Math.max(document.body.scrollHeight,document.documentElement.scrollHeight,document.body.offsetHeight,document.documentElement.offsetHeight,document.body.clientHeight,document.documentElement.clientHeight)}function p(e,t){var n,r,o=e[e.length-1];if(n=o,r=t,!(!s()||!a(n.content,r,!0)))return o;for(var i=e.length-1;0<=i;i--)if(a(e[i].content,t))return e[i]}function h(e,t){var n;!e||(n=e.nav.closest("li"))&&(n.classList.remove(t.navClass),e.content.classList.remove(t.contentClass),r(n,t),f("gumshoeDeactivate",n,{link:e.nav,content:e.content,settings:t}))}var m={navClass:"active",contentClass:"active",nested:!1,nestedClass:"active",offset:0,reflow:!1,events:!0},r=function(e,t){!t.nested||(e=e.parentNode.closest("li"))&&(e.classList.remove(t.nestedClass),r(e,t))},g=function(e,t){!t.nested||(e=e.parentNode.closest("li"))&&(e.classList.add(t.nestedClass),g(e,t))};return function(e,t){var n,o,i,r,a,s={setup:function(){n=document.querySelectorAll(e),o=[],Array.prototype.forEach.call(n,function(e){var t=document.getElementById(decodeURIComponent(e.hash.substr(1)));t&&o.push({nav:e,content:t})}),d(o)}};s.detect=function(){var e,t,n,r=p(o,a);r?i&&r.content===i.content||(h(i,a),t=a,!(e=r)||(n=e.nav.closest("li"))&&(n.classList.add(t.navClass),e.content.classList.add(t.contentClass),g(n,t),f("gumshoeActivate",n,{link:e.nav,content:e.content,settings:t})),i=r):i&&(h(i,a),i=null)};function u(e){r&&c.cancelAnimationFrame(r),r=c.requestAnimationFrame(s.detect)}function l(e){r&&c.cancelAnimationFrame(r),r=c.requestAnimationFrame(function(){d(o),s.detect()})}s.destroy=function(){i&&h(i,a),c.removeEventListener("scroll",u,!1),a.reflow&&c.removeEventListener("resize",l,!1),a=r=i=n=o=null};return a=function(){var n={};return Array.prototype.forEach.call(arguments,function(e){for(var t in 
e){if(!e.hasOwnProperty(t))return;n[t]=e[t]}}),n}(m,t||{}),s.setup(),s.detect(),c.addEventListener("scroll",u,!1),a.reflow&&c.addEventListener("resize",l,!1),s}}),$(document).ready(function(){$("#main").fitVids();function e(){(0===$(".author__urls-wrapper button").length?1024<$(window).width():!$(".author__urls-wrapper button").is(":visible"))?$(".sidebar").addClass("sticky"):$(".sidebar").removeClass("sticky")}e(),$(window).resize(function(){e()}),$(".author__urls-wrapper button").on("click",function(){$(".author__urls").toggleClass("is--visible"),$(".author__urls-wrapper button").toggleClass("open")}),$(document).keyup(function(e){27===e.keyCode&&$(".initial-content").hasClass("is--hidden")&&($(".search-content").toggleClass("is--visible"),$(".initial-content").toggleClass("is--hidden"))}),$(".search__toggle").on("click",function(){$(".search-content").toggleClass("is--visible"),$(".initial-content").toggleClass("is--hidden"),setTimeout(function(){$(".search-content input").focus()},400)});new SmoothScroll('a[href*="#"]',{offset:20,speed:400,speedAsDuration:!0,durationMax:500});0<$("nav.toc").length&&new Gumshoe("nav.toc a",{navClass:"active",contentClass:"active",nested:!1,nestedClass:"active",offset:20,reflow:!0,events:!0}),$("a[href$='.jpg'],a[href$='.jpeg'],a[href$='.JPG'],a[href$='.png'],a[href$='.gif'],a[href$='.webp']").addClass("image-popup"),$(".image-popup").magnificPopup({type:"image",tLoading:"Loading image #%curr%...",gallery:{enabled:!0,navigateByImgClick:!0,preload:[0,1]},image:{tError:'Image #%curr% could not be loaded.'},removalDelay:500,mainClass:"mfp-zoom-in",callbacks:{beforeOpen:function(){this.st.image.markup=this.st.image.markup.replace("mfp-figure","mfp-figure mfp-with-anim")}},closeOnContentClick:!0,midClick:!0}),$(".page__content").find("h1, h2, h3, h4, h5, h6").each(function(){var e,t=$(this).attr("id");t&&((e=document.createElement("a")).className="header-link",e.href="#"+t,e.innerHTML='Permalink',e.title="Permalink",$(this).append(e))})});
\ No newline at end of file
diff --git a/assets/js/plugins/gumshoe.js b/assets/js/plugins/gumshoe.js
new file mode 100644
index 00000000..713b6eb3
--- /dev/null
+++ b/assets/js/plugins/gumshoe.js
@@ -0,0 +1,484 @@
+/*!
+ * gumshoejs v5.1.1
+ * A simple, framework-agnostic scrollspy script.
+ * (c) 2019 Chris Ferdinandi
+ * MIT License
+ * http://github.com/cferdinandi/gumshoe
+ */
+
+(function (root, factory) {
+ if ( typeof define === 'function' && define.amd ) {
+ define([], (function () {
+ return factory(root);
+ }));
+ } else if ( typeof exports === 'object' ) {
+ module.exports = factory(root);
+ } else {
+ root.Gumshoe = factory(root);
+ }
+})(typeof global !== 'undefined' ? global : typeof window !== 'undefined' ? window : this, (function (window) {
+
+ 'use strict';
+
+ //
+ // Defaults
+ //
+
+ var defaults = {
+
+ // Active classes
+ navClass: 'active',
+ contentClass: 'active',
+
+ // Nested navigation
+ nested: false,
+ nestedClass: 'active',
+
+ // Offset & reflow
+ offset: 0,
+ reflow: false,
+
+ // Event support
+ events: true
+
+ };
+
+
+ //
+ // Methods
+ //
+
+ /**
+ * Merge two or more objects together.
+ * @param {Object} objects The objects to merge together
+ * @returns {Object} Merged values of defaults and options
+ */
+ var extend = function () {
+ var merged = {};
+ Array.prototype.forEach.call(arguments, (function (obj) {
+ for (var key in obj) {
+ if (!obj.hasOwnProperty(key)) return;
+ merged[key] = obj[key];
+ }
+ }));
+ return merged;
+ };
+
+ /**
+ * Emit a custom event
+ * @param {String} type The event type
+ * @param {Node} elem The element to attach the event to
+ * @param {Object} detail Any details to pass along with the event
+ */
+ var emitEvent = function (type, elem, detail) {
+
+ // Make sure events are enabled
+ if (!detail.settings.events) return;
+
+ // Create a new event
+ var event = new CustomEvent(type, {
+ bubbles: true,
+ cancelable: true,
+ detail: detail
+ });
+
+ // Dispatch the event
+ elem.dispatchEvent(event);
+
+ };
+
+ /**
+ * Get an element's distance from the top of the Document.
+ * @param {Node} elem The element
+ * @return {Number} Distance from the top in pixels
+ */
+ var getOffsetTop = function (elem) {
+ var location = 0;
+ if (elem.offsetParent) {
+ while (elem) {
+ location += elem.offsetTop;
+ elem = elem.offsetParent;
+ }
+ }
+ return location >= 0 ? location : 0;
+ };
+
+ /**
+ * Sort content from first to last in the DOM
+ * @param {Array} contents The content areas
+ */
+ var sortContents = function (contents) {
+ if(contents) {
+ contents.sort((function (item1, item2) {
+ var offset1 = getOffsetTop(item1.content);
+ var offset2 = getOffsetTop(item2.content);
+ if (offset1 < offset2) return -1;
+ return 1;
+ }));
+ }
+ };
+
+ /**
+ * Get the offset to use for calculating position
+ * @param {Object} settings The settings for this instantiation
+ * @return {Float} The number of pixels to offset the calculations
+ */
+ var getOffset = function (settings) {
+
+ // if the offset is a function run it
+ if (typeof settings.offset === 'function') {
+ return parseFloat(settings.offset());
+ }
+
+ // Otherwise, return it as-is
+ return parseFloat(settings.offset);
+
+ };
+
+ /**
+ * Get the document element's height
+ * @private
+ * @returns {Number}
+ */
+ var getDocumentHeight = function () {
+ return Math.max(
+ document.body.scrollHeight, document.documentElement.scrollHeight,
+ document.body.offsetHeight, document.documentElement.offsetHeight,
+ document.body.clientHeight, document.documentElement.clientHeight
+ );
+ };
+
+ /**
+ * Determine if an element is in view
+ * @param {Node} elem The element
+ * @param {Object} settings The settings for this instantiation
+ * @param {Boolean} bottom If true, check if element is above bottom of viewport instead
+ * @return {Boolean} Returns true if element is in the viewport
+ */
+ var isInView = function (elem, settings, bottom) {
+ var bounds = elem.getBoundingClientRect();
+ var offset = getOffset(settings);
+ if (bottom) {
+ return parseInt(bounds.bottom, 10) < (window.innerHeight || document.documentElement.clientHeight);
+ }
+ return parseInt(bounds.top, 10) <= offset;
+ };
+
+ /**
+ * Check if at the bottom of the viewport
+ * @return {Boolean} If true, page is at the bottom of the viewport
+ */
+ var isAtBottom = function () {
+ if (window.innerHeight + window.pageYOffset >= getDocumentHeight()) return true;
+ return false;
+ };
+
+ /**
+ * Check if the last item should be used (even if not at the top of the page)
+ * @param {Object} item The last item
+ * @param {Object} settings The settings for this instantiation
+ * @return {Boolean} If true, use the last item
+ */
+ var useLastItem = function (item, settings) {
+ if (isAtBottom() && isInView(item.content, settings, true)) return true;
+ return false;
+ };
+
+ /**
+ * Get the active content
+ * @param {Array} contents The content areas
+ * @param {Object} settings The settings for this instantiation
+ * @return {Object} The content area and matching navigation link
+ */
+ var getActive = function (contents, settings) {
+ var last = contents[contents.length-1];
+ if (useLastItem(last, settings)) return last;
+ for (var i = contents.length - 1; i >= 0; i--) {
+ if (isInView(contents[i].content, settings)) return contents[i];
+ }
+ };
+
+ /**
+ * Deactivate parent navs in a nested navigation
+ * @param {Node} nav The starting navigation element
+ * @param {Object} settings The settings for this instantiation
+ */
+ var deactivateNested = function (nav, settings) {
+
+ // If nesting isn't activated, bail
+ if (!settings.nested) return;
+
+ // Get the parent navigation
+ var li = nav.parentNode.closest('li');
+ if (!li) return;
+
+ // Remove the active class
+ li.classList.remove(settings.nestedClass);
+
+ // Apply recursively to any parent navigation elements
+ deactivateNested(li, settings);
+
+ };
+
+ /**
+ * Deactivate a nav and content area
+ * @param {Object} items The nav item and content to deactivate
+ * @param {Object} settings The settings for this instantiation
+ */
+ var deactivate = function (items, settings) {
+
+ // Make sure there are items to deactivate
+ if (!items) return;
+
+ // Get the parent list item
+ var li = items.nav.closest('li');
+ if (!li) return;
+
+ // Remove the active class from the nav and content
+ li.classList.remove(settings.navClass);
+ items.content.classList.remove(settings.contentClass);
+
+ // Deactivate any parent navs in a nested navigation
+ deactivateNested(li, settings);
+
+ // Emit a custom event
+ emitEvent('gumshoeDeactivate', li, {
+ link: items.nav,
+ content: items.content,
+ settings: settings
+ });
+
+ };
+
+
+ /**
+ * Activate parent navs in a nested navigation
+ * @param {Node} nav The starting navigation element
+ * @param {Object} settings The settings for this instantiation
+ */
+ var activateNested = function (nav, settings) {
+
+ // If nesting isn't activated, bail
+ if (!settings.nested) return;
+
+ // Get the parent navigation
+ var li = nav.parentNode.closest('li');
+ if (!li) return;
+
+ // Add the active class
+ li.classList.add(settings.nestedClass);
+
+ // Apply recursively to any parent navigation elements
+ activateNested(li, settings);
+
+ };
+
+ /**
+ * Activate a nav and content area
+ * @param {Object} items The nav item and content to activate
+ * @param {Object} settings The settings for this instantiation
+ */
+ var activate = function (items, settings) {
+
+ // Make sure there are items to activate
+ if (!items) return;
+
+ // Get the parent list item
+ var li = items.nav.closest('li');
+ if (!li) return;
+
+ // Add the active class to the nav and content
+ li.classList.add(settings.navClass);
+ items.content.classList.add(settings.contentClass);
+
+ // Activate any parent navs in a nested navigation
+ activateNested(li, settings);
+
+ // Emit a custom event
+ emitEvent('gumshoeActivate', li, {
+ link: items.nav,
+ content: items.content,
+ settings: settings
+ });
+
+ };
+
+ /**
+ * Create the Constructor object
+ * @param {String} selector The selector to use for navigation items
+ * @param {Object} options User options and settings
+ */
+ var Constructor = function (selector, options) {
+
+ //
+ // Variables
+ //
+
+ var publicAPIs = {};
+ var navItems, contents, current, timeout, settings;
+
+
+ //
+ // Methods
+ //
+
+ /**
+ * Set variables from DOM elements
+ */
+ publicAPIs.setup = function () {
+
+ // Get all nav items
+ navItems = document.querySelectorAll(selector);
+
+ // Create contents array
+ contents = [];
+
+ // Loop through each item, get its matching content, and push to the array
+ Array.prototype.forEach.call(navItems, (function (item) {
+
+ // Get the content for the nav item
+ var content = document.getElementById(decodeURIComponent(item.hash.substr(1)));
+ if (!content) return;
+
+ // Push to the contents array
+ contents.push({
+ nav: item,
+ content: content
+ });
+
+ }));
+
+ // Sort contents by the order they appear in the DOM
+ sortContents(contents);
+
+ };
+
+ /**
+ * Detect which content is currently active
+ */
+ publicAPIs.detect = function () {
+
+ // Get the active content
+ var active = getActive(contents, settings);
+
+ // if there's no active content, deactivate and bail
+ if (!active) {
+ if (current) {
+ deactivate(current, settings);
+ current = null;
+ }
+ return;
+ }
+
+ // If the active content is the one currently active, do nothing
+ if (current && active.content === current.content) return;
+
+ // Deactivate the current content and activate the new content
+ deactivate(current, settings);
+ activate(active, settings);
+
+ // Update the currently active content
+ current = active;
+
+ };
+
+ /**
+ * Detect the active content on scroll
+ * Debounced for performance
+ */
+ var scrollHandler = function (event) {
+
+ // If there's a timer, cancel it
+ if (timeout) {
+ window.cancelAnimationFrame(timeout);
+ }
+
+ // Setup debounce callback
+ timeout = window.requestAnimationFrame(publicAPIs.detect);
+
+ };
+
+ /**
+ * Update content sorting on resize
+ * Debounced for performance
+ */
+ var resizeHandler = function (event) {
+
+ // If there's a timer, cancel it
+ if (timeout) {
+ window.cancelAnimationFrame(timeout);
+ }
+
+ // Setup debounce callback
+ timeout = window.requestAnimationFrame((function () {
+ sortContents(contents);
+ publicAPIs.detect();
+ }));
+
+ };
+
+ /**
+ * Destroy the current instantiation
+ */
+ publicAPIs.destroy = function () {
+
+ // Undo DOM changes
+ if (current) {
+ deactivate(current, settings);
+ }
+
+ // Remove event listeners
+ window.removeEventListener('scroll', scrollHandler, false);
+ if (settings.reflow) {
+ window.removeEventListener('resize', resizeHandler, false);
+ }
+
+ // Reset variables
+ contents = null;
+ navItems = null;
+ current = null;
+ timeout = null;
+ settings = null;
+
+ };
+
+ /**
+ * Initialize the current instantiation
+ */
+ var init = function () {
+
+ // Merge user options into defaults
+ settings = extend(defaults, options || {});
+
+ // Setup variables based on the current DOM
+ publicAPIs.setup();
+
+ // Find the currently active content
+ publicAPIs.detect();
+
+ // Setup event listeners
+ window.addEventListener('scroll', scrollHandler, false);
+ if (settings.reflow) {
+ window.addEventListener('resize', resizeHandler, false);
+ }
+
+ };
+
+
+ //
+ // Initialize and return the public APIs
+ //
+
+ init();
+ return publicAPIs;
+
+ };
+
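+ /*
+ * Usage sketch (illustrative, not part of gumshoe itself): a table-of-contents
+ * scrollspy can be instantiated with a selector for the nav links plus the
+ * options documented above. The selector and option values are assumptions.
+ *
+ *   var spy = new Gumshoe('nav.toc a', { offset: 20, reflow: true });
+ *
+ *   document.addEventListener('gumshoeActivate', function (event) {
+ *     // event.detail carries { link, content, settings } for the active item
+ *     console.log('active section:', event.detail.content.id);
+ *   }, false);
+ *
+ *   // spy.destroy() removes listeners and active classes when no longer needed
+ */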
+
+ //
+ // Return the Constructor
+ //
+
+ return Constructor;
+
+}));
\ No newline at end of file
diff --git a/assets/js/plugins/jquery.ba-throttle-debounce.js b/assets/js/plugins/jquery.ba-throttle-debounce.js
new file mode 100644
index 00000000..fa30bdff
--- /dev/null
+++ b/assets/js/plugins/jquery.ba-throttle-debounce.js
@@ -0,0 +1,252 @@
+/*!
+ * jQuery throttle / debounce - v1.1 - 3/7/2010
+ * http://benalman.com/projects/jquery-throttle-debounce-plugin/
+ *
+ * Copyright (c) 2010 "Cowboy" Ben Alman
+ * Dual licensed under the MIT and GPL licenses.
+ * http://benalman.com/about/license/
+ */
+
+// Script: jQuery throttle / debounce: Sometimes, less is more!
+//
+// *Version: 1.1, Last updated: 3/7/2010*
+//
+// Project Home - http://benalman.com/projects/jquery-throttle-debounce-plugin/
+// GitHub - http://github.com/cowboy/jquery-throttle-debounce/
+// Source - http://github.com/cowboy/jquery-throttle-debounce/raw/master/jquery.ba-throttle-debounce.js
+// (Minified) - http://github.com/cowboy/jquery-throttle-debounce/raw/master/jquery.ba-throttle-debounce.min.js (0.7kb)
+//
+// About: License
+//
+// Copyright (c) 2010 "Cowboy" Ben Alman,
+// Dual licensed under the MIT and GPL licenses.
+// http://benalman.com/about/license/
+//
+// About: Examples
+//
+// These working examples, complete with fully commented code, illustrate a few
+// ways in which this plugin can be used.
+//
+// Throttle - http://benalman.com/code/projects/jquery-throttle-debounce/examples/throttle/
+// Debounce - http://benalman.com/code/projects/jquery-throttle-debounce/examples/debounce/
+//
+// About: Support and Testing
+//
+// Information about what version or versions of jQuery this plugin has been
+// tested with, what browsers it has been tested in, and where the unit tests
+// reside (so you can test it yourself).
+//
+// jQuery Versions - none, 1.3.2, 1.4.2
+// Browsers Tested - Internet Explorer 6-8, Firefox 2-3.6, Safari 3-4, Chrome 4-5, Opera 9.6-10.1.
+// Unit Tests - http://benalman.com/code/projects/jquery-throttle-debounce/unit/
+//
+// About: Release History
+//
+// 1.1 - (3/7/2010) Fixed a bug in jQuery.throttle where trailing callbacks
+// executed later than they should. Reworked a fair amount of internal
+// logic as well.
+// 1.0 - (3/6/2010) Initial release as a stand-alone project. Migrated over
+// from jquery-misc repo v0.4 to jquery-throttle repo v1.0, added the
+// no_trailing throttle parameter and debounce functionality.
+//
+// Topic: Note for non-jQuery users
+//
+// jQuery isn't actually required for this plugin, because nothing internal
+// uses any jQuery methods or properties. jQuery is just used as a namespace
+// under which these methods can exist.
+//
+// Since jQuery isn't actually required for this plugin, if jQuery doesn't exist
+// when this plugin is loaded, the method described below will be created in
+// the `Cowboy` namespace. Usage will be exactly the same, but instead of
+// $.method() or jQuery.method(), you'll need to use Cowboy.method().
+
+(function(window,undefined){
+ '$:nomunge'; // Used by YUI compressor.
+
+ // Since jQuery really isn't required for this plugin, use `jQuery` as the
+ // namespace only if it already exists, otherwise use the `Cowboy` namespace,
+ // creating it if necessary.
+ var $ = window.jQuery || window.Cowboy || ( window.Cowboy = {} ),
+
+ // Internal method reference.
+ jq_throttle;
+
+ // Method: jQuery.throttle
+ //
+ // Throttle execution of a function. Especially useful for rate limiting
+ // execution of handlers on events like resize and scroll. If you want to
+ // rate-limit execution of a function to a single time, see the
+ // jQuery.debounce method.
+ //
+ // In this visualization, | is a throttled-function call and X is the actual
+ // callback execution:
+ //
+ // > Throttled with `no_trailing` specified as false or unspecified:
+ // > ||||||||||||||||||||||||| (pause) |||||||||||||||||||||||||
+ // > X X X X X X X X X X X X
+ // >
+ // > Throttled with `no_trailing` specified as true:
+ // > ||||||||||||||||||||||||| (pause) |||||||||||||||||||||||||
+ // > X X X X X X X X X X
+ //
+ // Usage:
+ //
+ // > var throttled = jQuery.throttle( delay, [ no_trailing, ] callback );
+ // >
+ // > jQuery('selector').bind( 'someevent', throttled );
+ // > jQuery('selector').unbind( 'someevent', throttled );
+ //
+ // This also works in jQuery 1.4+:
+ //
+ // > jQuery('selector').bind( 'someevent', jQuery.throttle( delay, [ no_trailing, ] callback ) );
+ // > jQuery('selector').unbind( 'someevent', callback );
+ //
+ // Arguments:
+ //
+ // delay - (Number) A zero-or-greater delay in milliseconds. For event
+ // callbacks, values around 100 or 250 (or even higher) are most useful.
+ // no_trailing - (Boolean) Optional, defaults to false. If no_trailing is
+ // true, callback will only execute every `delay` milliseconds while the
+ // throttled-function is being called. If no_trailing is false or
+ // unspecified, callback will be executed one final time after the last
+ // throttled-function call. (After the throttled-function has not been
+ // called for `delay` milliseconds, the internal counter is reset)
+ // callback - (Function) A function to be executed after delay milliseconds.
+ // The `this` context and all arguments are passed through, as-is, to
+ // `callback` when the throttled-function is executed.
+ //
+ // Returns:
+ //
+ // (Function) A new, throttled, function.
+
+ $.throttle = jq_throttle = function( delay, no_trailing, callback, debounce_mode ) {
+ // After wrapper has stopped being called, this timeout ensures that
+ // `callback` is executed at the proper times in `throttle` and `end`
+ // debounce modes.
+ var timeout_id,
+
+ // Keep track of the last time `callback` was executed.
+ last_exec = 0;
+
+ // `no_trailing` defaults to falsy.
+ if ( typeof no_trailing !== 'boolean' ) {
+ debounce_mode = callback;
+ callback = no_trailing;
+ no_trailing = undefined;
+ }
+
+ // The `wrapper` function encapsulates all of the throttling / debouncing
+ // functionality and when executed will limit the rate at which `callback`
+ // is executed.
+ function wrapper() {
+ var that = this,
+ elapsed = +new Date() - last_exec,
+ args = arguments;
+
+ // Execute `callback` and update the `last_exec` timestamp.
+ function exec() {
+ last_exec = +new Date();
+ callback.apply( that, args );
+ };
+
+ // If `debounce_mode` is true (at_begin) this is used to clear the flag
+ // to allow future `callback` executions.
+ function clear() {
+ timeout_id = undefined;
+ };
+
+ if ( debounce_mode && !timeout_id ) {
+ // Since `wrapper` is being called for the first time and
+ // `debounce_mode` is true (at_begin), execute `callback`.
+ exec();
+ }
+
+ // Clear any existing timeout.
+ timeout_id && clearTimeout( timeout_id );
+
+ if ( debounce_mode === undefined && elapsed > delay ) {
+ // In throttle mode, if `delay` time has been exceeded, execute
+ // `callback`.
+ exec();
+
+ } else if ( no_trailing !== true ) {
+ // In trailing throttle mode, since `delay` time has not been
+ // exceeded, schedule `callback` to execute `delay` ms after most
+ // recent execution.
+ //
+ // If `debounce_mode` is true (at_begin), schedule `clear` to execute
+ // after `delay` ms.
+ //
+ // If `debounce_mode` is false (at end), schedule `callback` to
+ // execute after `delay` ms.
+ timeout_id = setTimeout( debounce_mode ? clear : exec, debounce_mode === undefined ? delay - elapsed : delay );
+ }
+ };
+
+ // Set the guid of `wrapper` function to the same of original callback, so
+ // it can be removed in jQuery 1.4+ .unbind or .die by using the original
+ // callback as a reference.
+ if ( $.guid ) {
+ wrapper.guid = callback.guid = callback.guid || $.guid++;
+ }
+
+ // Return the wrapper function.
+ return wrapper;
+ };
+
+ // Method: jQuery.debounce
+ //
+ // Debounce execution of a function. Debouncing, unlike throttling,
+ // guarantees that a function is only executed a single time, either at the
+ // very beginning of a series of calls, or at the very end. If you want to
+ // simply rate-limit execution of a function, see the
+ // jQuery.throttle method.
+ //
+ // In this visualization, | is a debounced-function call and X is the actual
+ // callback execution:
+ //
+ // > Debounced with `at_begin` specified as false or unspecified:
+ // > ||||||||||||||||||||||||| (pause) |||||||||||||||||||||||||
+ // > X X
+ // >
+ // > Debounced with `at_begin` specified as true:
+ // > ||||||||||||||||||||||||| (pause) |||||||||||||||||||||||||
+ // > X X
+ //
+ // Usage:
+ //
+ // > var debounced = jQuery.debounce( delay, [ at_begin, ] callback );
+ // >
+ // > jQuery('selector').bind( 'someevent', debounced );
+ // > jQuery('selector').unbind( 'someevent', debounced );
+ //
+ // This also works in jQuery 1.4+:
+ //
+ // > jQuery('selector').bind( 'someevent', jQuery.debounce( delay, [ at_begin, ] callback ) );
+ // > jQuery('selector').unbind( 'someevent', callback );
+ //
+ // Arguments:
+ //
+ // delay - (Number) A zero-or-greater delay in milliseconds. For event
+ // callbacks, values around 100 or 250 (or even higher) are most useful.
+ // at_begin - (Boolean) Optional, defaults to false. If at_begin is false or
+ // unspecified, callback will only be executed `delay` milliseconds after
+ // the last debounced-function call. If at_begin is true, callback will be
+ // executed only at the first debounced-function call. (After the
+ // debounced-function has not been called for `delay` milliseconds, the
+ // internal counter is reset)
+ // callback - (Function) A function to be executed after delay milliseconds.
+ // The `this` context and all arguments are passed through, as-is, to
+ // `callback` when the debounced-function is executed.
+ //
+ // Returns:
+ //
+ // (Function) A new, debounced, function.
+
+ $.debounce = function( delay, at_begin, callback ) {
+ return callback === undefined
+ ? jq_throttle( delay, at_begin, false )
+ : jq_throttle( delay, callback, at_begin !== false );
+ };
+
+})(this);
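For reference, a minimal sketch of how the throttle/debounce wrappers added above might be wired up in a page script; the handlers, selectors, and delay values are illustrative assumptions, not part of this changeset:

```javascript
// Assumes jQuery is loaded; without it the same methods live on Cowboy.*.
// Throttle: run at most once every 250 ms while the event keeps firing.
var onScroll = $.throttle(250, function () {
  console.log('scroll position:', window.pageYOffset);
});

// Debounce: run once, 300 ms after the last resize event.
var onResizeSettled = $.debounce(300, function () {
  console.log('viewport width:', $(window).width());
});

$(window).on('scroll', onScroll).on('resize', onResizeSettled);
```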
diff --git a/assets/js/plugins/jquery.fitvids.js b/assets/js/plugins/jquery.fitvids.js
new file mode 100644
index 00000000..5c2f85c9
--- /dev/null
+++ b/assets/js/plugins/jquery.fitvids.js
@@ -0,0 +1,82 @@
+/*jshint browser:true */
+/*!
+* FitVids 1.1
+*
+* Copyright 2013, Chris Coyier - http://css-tricks.com + Dave Rupert - http://daverupert.com
+* Credit to Thierry Koblentz - http://www.alistapart.com/articles/creating-intrinsic-ratios-for-video/
+* Released under the WTFPL license - http://sam.zoy.org/wtfpl/
+*
+*/
+
+;(function( $ ){
+
+ 'use strict';
+
+ $.fn.fitVids = function( options ) {
+ var settings = {
+ customSelector: null,
+ ignore: null
+ };
+
+ if(!document.getElementById('fit-vids-style')) {
+ // appendStyles: https://github.com/toddmotto/fluidvids/blob/master/dist/fluidvids.js
+ var head = document.head || document.getElementsByTagName('head')[0];
+ var css = '.fluid-width-video-wrapper{width:100%;position:relative;padding:0;}.fluid-width-video-wrapper iframe,.fluid-width-video-wrapper object,.fluid-width-video-wrapper embed {position:absolute;top:0;left:0;width:100%;height:100%;}';
+ var div = document.createElement("div");
+ div.innerHTML = '<p>x</p><style id="fit-vids-style">' + css + '</style>';
+ head.appendChild(div.childNodes[1]);
+ }
+
+ if ( options ) {
+ $.extend( settings, options );
+ }
+
+ return this.each(function(){
+ var selectors = [
+ 'iframe[src*="player.vimeo.com"]',
+ 'iframe[src*="youtube.com"]',
+ 'iframe[src*="youtube-nocookie.com"]',
+ 'iframe[src*="kickstarter.com"][src*="video.html"]',
+ 'object',
+ 'embed'
+ ];
+
+ if (settings.customSelector) {
+ selectors.push(settings.customSelector);
+ }
+
+ var ignoreList = '.fitvidsignore';
+
+ if(settings.ignore) {
+ ignoreList = ignoreList + ', ' + settings.ignore;
+ }
+
+ var $allVideos = $(this).find(selectors.join(','));
+ $allVideos = $allVideos.not('object object'); // SwfObj conflict patch
+ $allVideos = $allVideos.not(ignoreList); // Disable FitVids on this video.
+
+ $allVideos.each(function(count){
+ var $this = $(this);
+ if($this.parents(ignoreList).length > 0) {
+ return; // Disable FitVids on this video.
+ }
+ if (this.tagName.toLowerCase() === 'embed' && $this.parent('object').length || $this.parent('.fluid-width-video-wrapper').length) { return; }
+ if ((!$this.css('height') && !$this.css('width')) && (isNaN($this.attr('height')) || isNaN($this.attr('width'))))
+ {
+ $this.attr('height', 9);
+ $this.attr('width', 16);
+ }
+ var height = ( this.tagName.toLowerCase() === 'object' || ($this.attr('height') && !isNaN(parseInt($this.attr('height'), 10))) ) ? parseInt($this.attr('height'), 10) : $this.height(),
+ width = !isNaN(parseInt($this.attr('width'), 10)) ? parseInt($this.attr('width'), 10) : $this.width(),
+ aspectRatio = height / width;
+ if(!$this.attr('id')){
+ var videoID = 'fitvid' + count;
+ $this.attr('id', videoID);
+ }
+ $this.wrap('<div class="fluid-width-video-wrapper"></div>').parent('.fluid-width-video-wrapper').css('padding-top', (aspectRatio * 100)+'%');
+ $this.removeAttr('height').removeAttr('width');
+ });
+ });
+ };
+// Works with either jQuery or Zepto
+})( window.jQuery || window.Zepto );
\ No newline at end of file
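A brief usage sketch for the FitVids plugin added above; the `.page__content` container and the option values are assumptions for illustration only:

```javascript
// Make video embeds inside the page content keep their aspect ratio when resized.
$(document).ready(function () {
  $('.page__content').fitVids({
    // customSelector: 'iframe[src*="example.com"]', // hypothetical: extend the built-in selector list
    // ignore: '.no-fitvids'                         // hypothetical: skip matching embeds
  });
});
```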
diff --git a/assets/js/plugins/jquery.greedy-navigation.js b/assets/js/plugins/jquery.greedy-navigation.js
new file mode 100644
index 00000000..d8f32378
--- /dev/null
+++ b/assets/js/plugins/jquery.greedy-navigation.js
@@ -0,0 +1,127 @@
+/*
+GreedyNav.js - http://lukejacksonn.com/actuate
+Licensed under the MIT license - http://opensource.org/licenses/MIT
+Copyright (c) 2015 Luke Jackson
+*/
+
+$(function() {
+
+ var $btn = $("nav.greedy-nav .greedy-nav__toggle");
+ var $vlinks = $("nav.greedy-nav .visible-links");
+ var $hlinks = $("nav.greedy-nav .hidden-links");
+ var $nav = $("nav.greedy-nav");
+ var $logo = $('nav.greedy-nav .site-logo');
+ var $logoImg = $('nav.greedy-nav .site-logo img');
+ var $title = $("nav.greedy-nav .site-title");
+ var $search = $('nav.greedy-nav button.search__toggle');
+
+ var numOfItems, totalSpace, closingTime, breakWidths;
+
+ // This function measures both hidden and visible links and sets the navbar breakpoints
+ // This is called the first time the script runs and every time the "check()" function detects a window-width change that reaches a different CSS width breakpoint, which affects the size of the navbar items
+ // Please note that "CSS width breakpoints" (which are only 4) !== "navbar breakpoints" (which are as many as the number of items on the navbar)
+ function measureLinks(){
+ numOfItems = 0;
+ totalSpace = 0;
+ closingTime = 1000;
+ breakWidths = [];
+
+ // Adds the width of a navItem in order to create breakpoints for the navbar
+ function addWidth(i, w) {
+ totalSpace += w;
+ numOfItems += 1;
+ breakWidths.push(totalSpace);
+ }
+
+ // Measures the width of hidden links by making a temporary clone of them and positioning under visible links
+ function hiddenWidth(obj){
+ var clone = obj.clone();
+ clone.css("visibility","hidden");
+ $vlinks.append(clone);
+ addWidth(0, clone.outerWidth());
+ clone.remove();
+ }
+ // Measure both visible and hidden links widths
+ $vlinks.children().outerWidth(addWidth);
+ $hlinks.children().each(function(){hiddenWidth($(this))});
+ }
+ // Get initial state
+ measureLinks();
+
+ var winWidth = $( window ).width();
+ // Set the last measured CSS width breakpoint: 0: <768px, 1: <1024px, 2: < 1280px, 3: >= 1280px.
+ var lastBreakpoint = winWidth < 768 ? 0 : winWidth < 1024 ? 1 : winWidth < 1280 ? 2 : 3;
+
+ var availableSpace, numOfVisibleItems, requiredSpace, timer;
+
+ function check() {
+
+ winWidth = $( window ).width();
+ // Set the current CSS width breakpoint: 0: <768px, 1: <1024px, 2: < 1280px, 3: >= 1280px.
+ var curBreakpoint = winWidth < 768 ? 0 : winWidth < 1024 ? 1 : winWidth < 1280 ? 2 : 3;
+ // If current breakpoint is different from last measured breakpoint, measureLinks again
+ if(curBreakpoint !== lastBreakpoint) measureLinks();
+ // Set the last measured CSS width breakpoint with the current breakpoint
+ lastBreakpoint = curBreakpoint;
+
+ // Get instant state
+ numOfVisibleItems = $vlinks.children().length;
+ // Subtract the width of visible elements from the nav innerWidth to find out the available space for nav items
+ availableSpace = /* nav */ $nav.innerWidth()
+ - /* logo */ ($logo.length !== 0 ? $logo.outerWidth(true) : 0)
+ - /* title */ $title.outerWidth(true)
+ - /* search */ ($search.length !== 0 ? $search.outerWidth(true) : 0)
+ - /* toggle */ (numOfVisibleItems !== breakWidths.length ? $btn.outerWidth(true) : 0);
+ requiredSpace = breakWidths[numOfVisibleItems - 1];
+
+ // There is not enough space
+ if (requiredSpace > availableSpace) {
+ $vlinks.children().last().prependTo($hlinks);
+ numOfVisibleItems -= 1;
+ check();
+ // There is more than enough space. If only one element is hidden, add the toggle width to the available space
+ } else if (availableSpace + (numOfVisibleItems === breakWidths.length - 1?$btn.outerWidth(true):0) > breakWidths[numOfVisibleItems]) {
+ $hlinks.children().first().appendTo($vlinks);
+ numOfVisibleItems += 1;
+ check();
+ }
+ // Update the button accordingly
+ $btn.attr("count", numOfItems - numOfVisibleItems);
+ if (numOfVisibleItems === numOfItems) {
+ $btn.addClass('hidden');
+ } else $btn.removeClass('hidden');
+ }
+
+ // Window listeners
+ $(window).resize(function() {
+ check();
+ });
+
+ $btn.on('click', function() {
+ $hlinks.toggleClass('hidden');
+ $(this).toggleClass('close');
+ clearTimeout(timer);
+ });
+
+ $hlinks.on('mouseleave', function() {
+ // Mouse has left, start the timer
+ timer = setTimeout(function() {
+ $hlinks.addClass('hidden');
+ }, closingTime);
+ }).on('mouseenter', function() {
+ // Mouse is back, cancel the timer
+ clearTimeout(timer);
+ })
+
+ // check if page has a logo
+ if($logoImg.length !== 0){
+ // check if logo is not loaded
+ if(!($logoImg[0].complete || $logoImg[0].naturalWidth !== 0)){
+ // if the logo is not loaded yet, wait for it to load (or fail) before checking
+ $logoImg.one("load error", check);
+ // if logo is already loaded just check
+ } else check();
+ // if page does not have a logo just check
+ } else check();
+
+});
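To clarify the measurement logic above: `measureLinks()` builds `breakWidths` as a running sum of item widths, so `breakWidths[n - 1]` is the space needed to show the first `n` links, and `check()` compares that against the space left over after the logo, title, search toggle and nav toggle. A small standalone sketch with made-up widths:

```javascript
// Illustrative only: hypothetical per-link widths in pixels.
var itemWidths = [120, 90, 110];
var breakWidths = [];
var total = 0;
itemWidths.forEach(function (w) {
  total += w;
  breakWidths.push(total); // [120, 210, 320]
});
// With 300px of available space, breakWidths[1] (210) fits but breakWidths[2] (320)
// does not, so two links stay visible and the third is moved to the hidden list.
```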
diff --git a/assets/js/plugins/jquery.magnific-popup.js b/assets/js/plugins/jquery.magnific-popup.js
new file mode 100644
index 00000000..7d1d1978
--- /dev/null
+++ b/assets/js/plugins/jquery.magnific-popup.js
@@ -0,0 +1,1860 @@
+/*! Magnific Popup - v1.1.0 - 2016-02-20
+* http://dimsemenov.com/plugins/magnific-popup/
+* Copyright (c) 2016 Dmitry Semenov; */
+;(function (factory) {
+ if (typeof define === 'function' && define.amd) {
+ // AMD. Register as an anonymous module.
+ define(['jquery'], factory);
+ } else if (typeof exports === 'object') {
+ // Node/CommonJS
+ factory(require('jquery'));
+ } else {
+ // Browser globals
+ factory(window.jQuery || window.Zepto);
+ }
+ }(function($) {
+
+ /*>>core*/
+ /**
+ *
+ * Magnific Popup Core JS file
+ *
+ */
+
+
+ /**
+ * Private static constants
+ */
+ var CLOSE_EVENT = 'Close',
+ BEFORE_CLOSE_EVENT = 'BeforeClose',
+ AFTER_CLOSE_EVENT = 'AfterClose',
+ BEFORE_APPEND_EVENT = 'BeforeAppend',
+ MARKUP_PARSE_EVENT = 'MarkupParse',
+ OPEN_EVENT = 'Open',
+ CHANGE_EVENT = 'Change',
+ NS = 'mfp',
+ EVENT_NS = '.' + NS,
+ READY_CLASS = 'mfp-ready',
+ REMOVING_CLASS = 'mfp-removing',
+ PREVENT_CLOSE_CLASS = 'mfp-prevent-close';
+
+
+ /**
+ * Private vars
+ */
+ /*jshint -W079 */
+ var mfp, // As we have only one instance of MagnificPopup object, we define it locally so we don't have to use 'this'
+ MagnificPopup = function(){},
+ _isJQ = !!(window.jQuery),
+ _prevStatus,
+ _window = $(window),
+ _document,
+ _prevContentType,
+ _wrapClasses,
+ _currPopupType;
+
+
+ /**
+ * Private functions
+ */
+ var _mfpOn = function(name, f) {
+ mfp.ev.on(NS + name + EVENT_NS, f);
+ },
+ _getEl = function(className, appendTo, html, raw) {
+ var el = document.createElement('div');
+ el.className = 'mfp-'+className;
+ if(html) {
+ el.innerHTML = html;
+ }
+ if(!raw) {
+ el = $(el);
+ if(appendTo) {
+ el.appendTo(appendTo);
+ }
+ } else if(appendTo) {
+ appendTo.appendChild(el);
+ }
+ return el;
+ },
+ _mfpTrigger = function(e, data) {
+ mfp.ev.triggerHandler(NS + e, data);
+
+ if(mfp.st.callbacks) {
+ // converts "mfpEventName" to "eventName" callback and triggers it if it's present
+ e = e.charAt(0).toLowerCase() + e.slice(1);
+ if(mfp.st.callbacks[e]) {
+ mfp.st.callbacks[e].apply(mfp, $.isArray(data) ? data : [data]);
+ }
+ }
+ },
+ _getCloseBtn = function(type) {
+ if(type !== _currPopupType || !mfp.currTemplate.closeBtn) {
+ mfp.currTemplate.closeBtn = $( mfp.st.closeMarkup.replace('%title%', mfp.st.tClose ) );
+ _currPopupType = type;
+ }
+ return mfp.currTemplate.closeBtn;
+ },
+ // Initialize Magnific Popup only when called at least once
+ _checkInstance = function() {
+ if(!$.magnificPopup.instance) {
+ /*jshint -W020 */
+ mfp = new MagnificPopup();
+ mfp.init();
+ $.magnificPopup.instance = mfp;
+ }
+ },
+ // CSS transition detection, http://stackoverflow.com/questions/7264899/detect-css-transitions-using-javascript-and-without-modernizr
+ supportsTransitions = function() {
+ var s = document.createElement('p').style, // 's' for style. better to create an element if body yet to exist
+ v = ['ms','O','Moz','Webkit']; // 'v' for vendor
+
+ if( s['transition'] !== undefined ) {
+ return true;
+ }
+
+ while( v.length ) {
+ if( v.pop() + 'Transition' in s ) {
+ return true;
+ }
+ }
+
+ return false;
+ };
+
+
+
+ /**
+ * Public functions
+ */
+ MagnificPopup.prototype = {
+
+ constructor: MagnificPopup,
+
+ /**
+ * Initializes Magnific Popup plugin.
+ * This function is triggered only once when $.fn.magnificPopup or $.magnificPopup is executed
+ */
+ init: function() {
+ var appVersion = navigator.appVersion;
+ mfp.isLowIE = mfp.isIE8 = document.all && !document.addEventListener;
+ mfp.isAndroid = (/android/gi).test(appVersion);
+ mfp.isIOS = (/iphone|ipad|ipod/gi).test(appVersion);
+ mfp.supportsTransition = supportsTransitions();
+
+ // We disable fixed positioned lightbox on devices that don't handle it nicely.
+ // If you know a better way of detecting this - let me know.
+ mfp.probablyMobile = (mfp.isAndroid || mfp.isIOS || /(Opera Mini)|Kindle|webOS|BlackBerry|(Opera Mobi)|(Windows Phone)|IEMobile/i.test(navigator.userAgent) );
+ _document = $(document);
+
+ mfp.popupsCache = {};
+ },
+
+ /**
+ * Opens popup
+ * @param data [description]
+ */
+ open: function(data) {
+
+ var i;
+
+ if(data.isObj === false) {
+ // convert jQuery collection to array to avoid conflicts later
+ mfp.items = data.items.toArray();
+
+ mfp.index = 0;
+ var items = data.items,
+ item;
+ for(i = 0; i < items.length; i++) {
+ item = items[i];
+ if(item.parsed) {
+ item = item.el[0];
+ }
+ if(item === data.el[0]) {
+ mfp.index = i;
+ break;
+ }
+ }
+ } else {
+ mfp.items = $.isArray(data.items) ? data.items : [data.items];
+ mfp.index = data.index || 0;
+ }
+
+ // if popup is already opened - we just update the content
+ if(mfp.isOpen) {
+ mfp.updateItemHTML();
+ return;
+ }
+
+ mfp.types = [];
+ _wrapClasses = '';
+ if(data.mainEl && data.mainEl.length) {
+ mfp.ev = data.mainEl.eq(0);
+ } else {
+ mfp.ev = _document;
+ }
+
+ if(data.key) {
+ if(!mfp.popupsCache[data.key]) {
+ mfp.popupsCache[data.key] = {};
+ }
+ mfp.currTemplate = mfp.popupsCache[data.key];
+ } else {
+ mfp.currTemplate = {};
+ }
+
+
+
+ mfp.st = $.extend(true, {}, $.magnificPopup.defaults, data );
+ mfp.fixedContentPos = mfp.st.fixedContentPos === 'auto' ? !mfp.probablyMobile : mfp.st.fixedContentPos;
+
+ if(mfp.st.modal) {
+ mfp.st.closeOnContentClick = false;
+ mfp.st.closeOnBgClick = false;
+ mfp.st.showCloseBtn = false;
+ mfp.st.enableEscapeKey = false;
+ }
+
+
+ // Building markup
+ // main containers are created only once
+ if(!mfp.bgOverlay) {
+
+ // Dark overlay
+ mfp.bgOverlay = _getEl('bg').on('click'+EVENT_NS, function() {
+ mfp.close();
+ });
+
+ mfp.wrap = _getEl('wrap').attr('tabindex', -1).on('click'+EVENT_NS, function(e) {
+ if(mfp._checkIfClose(e.target)) {
+ mfp.close();
+ }
+ });
+
+ mfp.container = _getEl('container', mfp.wrap);
+ }
+
+ mfp.contentContainer = _getEl('content');
+ if(mfp.st.preloader) {
+ mfp.preloader = _getEl('preloader', mfp.container, mfp.st.tLoading);
+ }
+
+
+ // Initializing modules
+ var modules = $.magnificPopup.modules;
+ for(i = 0; i < modules.length; i++) {
+ var n = modules[i];
+ n = n.charAt(0).toUpperCase() + n.slice(1);
+ mfp['init'+n].call(mfp);
+ }
+ _mfpTrigger('BeforeOpen');
+
+
+ if(mfp.st.showCloseBtn) {
+ // Close button
+ if(!mfp.st.closeBtnInside) {
+ mfp.wrap.append( _getCloseBtn() );
+ } else {
+ _mfpOn(MARKUP_PARSE_EVENT, function(e, template, values, item) {
+ values.close_replaceWith = _getCloseBtn(item.type);
+ });
+ _wrapClasses += ' mfp-close-btn-in';
+ }
+ }
+
+ if(mfp.st.alignTop) {
+ _wrapClasses += ' mfp-align-top';
+ }
+
+
+
+ if(mfp.fixedContentPos) {
+ mfp.wrap.css({
+ overflow: mfp.st.overflowY,
+ overflowX: 'hidden',
+ overflowY: mfp.st.overflowY
+ });
+ } else {
+ mfp.wrap.css({
+ top: _window.scrollTop(),
+ position: 'absolute'
+ });
+ }
+ if( mfp.st.fixedBgPos === false || (mfp.st.fixedBgPos === 'auto' && !mfp.fixedContentPos) ) {
+ mfp.bgOverlay.css({
+ height: _document.height(),
+ position: 'absolute'
+ });
+ }
+
+
+
+ if(mfp.st.enableEscapeKey) {
+ // Close on ESC key
+ _document.on('keyup' + EVENT_NS, function(e) {
+ if(e.keyCode === 27) {
+ mfp.close();
+ }
+ });
+ }
+
+ _window.on('resize' + EVENT_NS, function() {
+ mfp.updateSize();
+ });
+
+
+ if(!mfp.st.closeOnContentClick) {
+ _wrapClasses += ' mfp-auto-cursor';
+ }
+
+ if(_wrapClasses)
+ mfp.wrap.addClass(_wrapClasses);
+
+
+ // this triggers recalculation of layout, so we get it once rather than triggering it twice
+ var windowHeight = mfp.wH = _window.height();
+
+
+ var windowStyles = {};
+
+ if( mfp.fixedContentPos ) {
+ if(mfp._hasScrollBar(windowHeight)){
+ var s = mfp._getScrollbarSize();
+ if(s) {
+ windowStyles.marginRight = s;
+ }
+ }
+ }
+
+ if(mfp.fixedContentPos) {
+ if(!mfp.isIE7) {
+ windowStyles.overflow = 'hidden';
+ } else {
+ // ie7 double-scroll bug
+ $('body, html').css('overflow', 'hidden');
+ }
+ }
+
+
+
+ var classesToadd = mfp.st.mainClass;
+ if(mfp.isIE7) {
+ classesToadd += ' mfp-ie7';
+ }
+ if(classesToadd) {
+ mfp._addClassToMFP( classesToadd );
+ }
+
+ // add content
+ mfp.updateItemHTML();
+
+ _mfpTrigger('BuildControls');
+
+ // remove scrollbar, add margin e.t.c
+ $('html').css(windowStyles);
+
+ // add everything to DOM
+ mfp.bgOverlay.add(mfp.wrap).prependTo( mfp.st.prependTo || $(document.body) );
+
+ // Save last focused element
+ mfp._lastFocusedEl = document.activeElement;
+
+ // Wait for next cycle to allow CSS transition
+ setTimeout(function() {
+
+ if(mfp.content) {
+ mfp._addClassToMFP(READY_CLASS);
+ mfp._setFocus();
+ } else {
+ // if content is not defined (not loaded e.t.c) we add class only for BG
+ mfp.bgOverlay.addClass(READY_CLASS);
+ }
+
+ // Trap the focus in popup
+ _document.on('focusin' + EVENT_NS, mfp._onFocusIn);
+
+ }, 16);
+
+ mfp.isOpen = true;
+ mfp.updateSize(windowHeight);
+ _mfpTrigger(OPEN_EVENT);
+
+ return data;
+ },
+
+ /**
+ * Closes the popup
+ */
+ close: function() {
+ if(!mfp.isOpen) return;
+ _mfpTrigger(BEFORE_CLOSE_EVENT);
+
+ mfp.isOpen = false;
+ // for CSS3 animation
+ if(mfp.st.removalDelay && !mfp.isLowIE && mfp.supportsTransition ) {
+ mfp._addClassToMFP(REMOVING_CLASS);
+ setTimeout(function() {
+ mfp._close();
+ }, mfp.st.removalDelay);
+ } else {
+ mfp._close();
+ }
+ },
+
+ /**
+ * Helper for close() function
+ */
+ _close: function() {
+ _mfpTrigger(CLOSE_EVENT);
+
+ var classesToRemove = REMOVING_CLASS + ' ' + READY_CLASS + ' ';
+
+ mfp.bgOverlay.detach();
+ mfp.wrap.detach();
+ mfp.container.empty();
+
+ if(mfp.st.mainClass) {
+ classesToRemove += mfp.st.mainClass + ' ';
+ }
+
+ mfp._removeClassFromMFP(classesToRemove);
+
+ if(mfp.fixedContentPos) {
+ var windowStyles = {marginRight: ''};
+ if(mfp.isIE7) {
+ $('body, html').css('overflow', '');
+ } else {
+ windowStyles.overflow = '';
+ }
+ $('html').css(windowStyles);
+ }
+
+ _document.off('keyup' + EVENT_NS + ' focusin' + EVENT_NS);
+ mfp.ev.off(EVENT_NS);
+
+ // clean up DOM elements that aren't removed
+ mfp.wrap.attr('class', 'mfp-wrap').removeAttr('style');
+ mfp.bgOverlay.attr('class', 'mfp-bg');
+ mfp.container.attr('class', 'mfp-container');
+
+ // remove close button from target element
+ if(mfp.st.showCloseBtn &&
+ (!mfp.st.closeBtnInside || mfp.currTemplate[mfp.currItem.type] === true)) {
+ if(mfp.currTemplate.closeBtn)
+ mfp.currTemplate.closeBtn.detach();
+ }
+
+
+ if(mfp.st.autoFocusLast && mfp._lastFocusedEl) {
+ $(mfp._lastFocusedEl).focus(); // put tab focus back
+ }
+ mfp.currItem = null;
+ mfp.content = null;
+ mfp.currTemplate = null;
+ mfp.prevHeight = 0;
+
+ _mfpTrigger(AFTER_CLOSE_EVENT);
+ },
+
+ updateSize: function(winHeight) {
+
+ if(mfp.isIOS) {
+ // fixes iOS nav bars https://github.com/dimsemenov/Magnific-Popup/issues/2
+ var zoomLevel = document.documentElement.clientWidth / window.innerWidth;
+ var height = window.innerHeight * zoomLevel;
+ mfp.wrap.css('height', height);
+ mfp.wH = height;
+ } else {
+ mfp.wH = winHeight || _window.height();
+ }
+ // Fixes #84: popup incorrectly positioned with position:relative on body
+ if(!mfp.fixedContentPos) {
+ mfp.wrap.css('height', mfp.wH);
+ }
+
+ _mfpTrigger('Resize');
+
+ },
+
+ /**
+ * Set content of popup based on current index
+ */
+ updateItemHTML: function() {
+ var item = mfp.items[mfp.index];
+
+ // Detach and perform modifications
+ mfp.contentContainer.detach();
+
+ if(mfp.content)
+ mfp.content.detach();
+
+ if(!item.parsed) {
+ item = mfp.parseEl( mfp.index );
+ }
+
+ var type = item.type;
+
+ _mfpTrigger('BeforeChange', [mfp.currItem ? mfp.currItem.type : '', type]);
+ // BeforeChange event works like so:
+ // _mfpOn('BeforeChange', function(e, prevType, newType) { });
+
+ mfp.currItem = item;
+
+ if(!mfp.currTemplate[type]) {
+ var markup = mfp.st[type] ? mfp.st[type].markup : false;
+
+ // allows modifying the markup
+ _mfpTrigger('FirstMarkupParse', markup);
+
+ if(markup) {
+ mfp.currTemplate[type] = $(markup);
+ } else {
+ // if there is no markup found we just define that template is parsed
+ mfp.currTemplate[type] = true;
+ }
+ }
+
+ if(_prevContentType && _prevContentType !== item.type) {
+ mfp.container.removeClass('mfp-'+_prevContentType+'-holder');
+ }
+
+ var newContent = mfp['get' + type.charAt(0).toUpperCase() + type.slice(1)](item, mfp.currTemplate[type]);
+ mfp.appendContent(newContent, type);
+
+ item.preloaded = true;
+
+ _mfpTrigger(CHANGE_EVENT, item);
+ _prevContentType = item.type;
+
+ // Append container back after its content changed
+ mfp.container.prepend(mfp.contentContainer);
+
+ _mfpTrigger('AfterChange');
+ },
+
+
+ /**
+ * Set HTML content of popup
+ */
+ appendContent: function(newContent, type) {
+ mfp.content = newContent;
+
+ if(newContent) {
+ if(mfp.st.showCloseBtn && mfp.st.closeBtnInside &&
+ mfp.currTemplate[type] === true) {
+ // if there is no markup, we just append close button element inside
+ if(!mfp.content.find('.mfp-close').length) {
+ mfp.content.append(_getCloseBtn());
+ }
+ } else {
+ mfp.content = newContent;
+ }
+ } else {
+ mfp.content = '';
+ }
+
+ _mfpTrigger(BEFORE_APPEND_EVENT);
+ mfp.container.addClass('mfp-'+type+'-holder');
+
+ mfp.contentContainer.append(mfp.content);
+ },
+
+
+ /**
+ * Creates Magnific Popup data object based on given data
+ * @param {int} index Index of item to parse
+ */
+ parseEl: function(index) {
+ var item = mfp.items[index],
+ type;
+
+ if(item.tagName) {
+ item = { el: $(item) };
+ } else {
+ type = item.type;
+ item = { data: item, src: item.src };
+ }
+
+ if(item.el) {
+ var types = mfp.types;
+
+ // check for 'mfp-TYPE' class
+ for(var i = 0; i < types.length; i++) {
+ if( item.el.hasClass('mfp-'+types[i]) ) {
+ type = types[i];
+ break;
+ }
+ }
+
+ item.src = item.el.attr('data-mfp-src');
+ if(!item.src) {
+ item.src = item.el.attr('href');
+ }
+ }
+
+ item.type = type || mfp.st.type || 'inline';
+ item.index = index;
+ item.parsed = true;
+ mfp.items[index] = item;
+ _mfpTrigger('ElementParse', item);
+
+ return mfp.items[index];
+ },
+
+
+ /**
+ * Initializes single popup or a group of popups
+ */
+ addGroup: function(el, options) {
+ var eHandler = function(e) {
+ e.mfpEl = this;
+ mfp._openClick(e, el, options);
+ };
+
+ if(!options) {
+ options = {};
+ }
+
+ var eName = 'click.magnificPopup';
+ options.mainEl = el;
+
+ if(options.items) {
+ options.isObj = true;
+ el.off(eName).on(eName, eHandler);
+ } else {
+ options.isObj = false;
+ if(options.delegate) {
+ el.off(eName).on(eName, options.delegate , eHandler);
+ } else {
+ options.items = el;
+ el.off(eName).on(eName, eHandler);
+ }
+ }
+ },
+ _openClick: function(e, el, options) {
+ var midClick = options.midClick !== undefined ? options.midClick : $.magnificPopup.defaults.midClick;
+
+
+ if(!midClick && ( e.which === 2 || e.ctrlKey || e.metaKey || e.altKey || e.shiftKey ) ) {
+ return;
+ }
+
+ var disableOn = options.disableOn !== undefined ? options.disableOn : $.magnificPopup.defaults.disableOn;
+
+ if(disableOn) {
+ if($.isFunction(disableOn)) {
+ if( !disableOn.call(mfp) ) {
+ return true;
+ }
+ } else { // else it's number
+ if( _window.width() < disableOn ) {
+ return true;
+ }
+ }
+ }
+
+ if(e.type) {
+ e.preventDefault();
+
+ // This will prevent popup from closing if element is inside and popup is already opened
+ if(mfp.isOpen) {
+ e.stopPropagation();
+ }
+ }
+
+ options.el = $(e.mfpEl);
+ if(options.delegate) {
+ options.items = el.find(options.delegate);
+ }
+ mfp.open(options);
+ },
+
+
+ /**
+ * Updates text on preloader
+ */
+ updateStatus: function(status, text) {
+
+ if(mfp.preloader) {
+ if(_prevStatus !== status) {
+ mfp.container.removeClass('mfp-s-'+_prevStatus);
+ }
+
+ if(!text && status === 'loading') {
+ text = mfp.st.tLoading;
+ }
+
+ var data = {
+ status: status,
+ text: text
+ };
+ // allows modifying the status
+ _mfpTrigger('UpdateStatus', data);
+
+ status = data.status;
+ text = data.text;
+
+ mfp.preloader.html(text);
+
+ mfp.preloader.find('a').on('click', function(e) {
+ e.stopImmediatePropagation();
+ });
+
+ mfp.container.addClass('mfp-s-'+status);
+ _prevStatus = status;
+ }
+ },
+
+
+ /*
+ "Private" helpers that aren't private at all
+ */
+ // Check to close popup or not
+ // "target" is an element that was clicked
+ _checkIfClose: function(target) {
+
+ if($(target).hasClass(PREVENT_CLOSE_CLASS)) {
+ return;
+ }
+
+ var closeOnContent = mfp.st.closeOnContentClick;
+ var closeOnBg = mfp.st.closeOnBgClick;
+
+ if(closeOnContent && closeOnBg) {
+ return true;
+ } else {
+
+ // We close the popup if click is on close button or on preloader. Or if there is no content.
+ if(!mfp.content || $(target).hasClass('mfp-close') || (mfp.preloader && target === mfp.preloader[0]) ) {
+ return true;
+ }
+
+ // if click is outside the content
+ if( (target !== mfp.content[0] && !$.contains(mfp.content[0], target)) ) {
+ if(closeOnBg) {
+ // last check, if the clicked element is in DOM, (in case it's removed onclick)
+ if( $.contains(document, target) ) {
+ return true;
+ }
+ }
+ } else if(closeOnContent) {
+ return true;
+ }
+
+ }
+ return false;
+ },
+ _addClassToMFP: function(cName) {
+ mfp.bgOverlay.addClass(cName);
+ mfp.wrap.addClass(cName);
+ },
+ _removeClassFromMFP: function(cName) {
+ this.bgOverlay.removeClass(cName);
+ mfp.wrap.removeClass(cName);
+ },
+ _hasScrollBar: function(winHeight) {
+ return ( (mfp.isIE7 ? _document.height() : document.body.scrollHeight) > (winHeight || _window.height()) );
+ },
+ _setFocus: function() {
+ (mfp.st.focus ? mfp.content.find(mfp.st.focus).eq(0) : mfp.wrap).focus();
+ },
+ _onFocusIn: function(e) {
+ if( e.target !== mfp.wrap[0] && !$.contains(mfp.wrap[0], e.target) ) {
+ mfp._setFocus();
+ return false;
+ }
+ },
+ _parseMarkup: function(template, values, item) {
+ var arr;
+ if(item.data) {
+ values = $.extend(item.data, values);
+ }
+ _mfpTrigger(MARKUP_PARSE_EVENT, [template, values, item] );
+
+ $.each(values, function(key, value) {
+ if(value === undefined || value === false) {
+ return true;
+ }
+ arr = key.split('_');
+ if(arr.length > 1) {
+ var el = template.find(EVENT_NS + '-'+arr[0]);
+
+ if(el.length > 0) {
+ var attr = arr[1];
+ if(attr === 'replaceWith') {
+ if(el[0] !== value[0]) {
+ el.replaceWith(value);
+ }
+ } else if(attr === 'img') {
+ if(el.is('img')) {
+ el.attr('src', value);
+ } else {
+ el.replaceWith( $('<img>').attr('src', value).attr('class', el.attr('class')) );
+ }
+ } else {
+ el.attr(arr[1], value);
+ }
+ }
+
+ } else {
+ template.find(EVENT_NS + '-'+key).html(value);
+ }
+ });
+ },
+
+ _getScrollbarSize: function() {
+ // thx David
+ if(mfp.scrollbarSize === undefined) {
+ var scrollDiv = document.createElement("div");
+ scrollDiv.style.cssText = 'width: 99px; height: 99px; overflow: scroll; position: absolute; top: -9999px;';
+ document.body.appendChild(scrollDiv);
+ mfp.scrollbarSize = scrollDiv.offsetWidth - scrollDiv.clientWidth;
+ document.body.removeChild(scrollDiv);
+ }
+ return mfp.scrollbarSize;
+ }
+
+ }; /* MagnificPopup core prototype end */
+
+
+
+
+ /**
+ * Public static functions
+ */
+ $.magnificPopup = {
+ instance: null,
+ proto: MagnificPopup.prototype,
+ modules: [],
+
+ open: function(options, index) {
+ _checkInstance();
+
+ if(!options) {
+ options = {};
+ } else {
+ options = $.extend(true, {}, options);
+ }
+
+ options.isObj = true;
+ options.index = index || 0;
+ return this.instance.open(options);
+ },
+
+ close: function() {
+ return $.magnificPopup.instance && $.magnificPopup.instance.close();
+ },
+
+ registerModule: function(name, module) {
+ if(module.options) {
+ $.magnificPopup.defaults[name] = module.options;
+ }
+ $.extend(this.proto, module.proto);
+ this.modules.push(name);
+ },
+
+ defaults: {
+
+ // Info about options is in docs:
+ // http://dimsemenov.com/plugins/magnific-popup/documentation.html#options
+
+ disableOn: 0,
+
+ key: null,
+
+ midClick: false,
+
+ mainClass: '',
+
+ preloader: true,
+
+ focus: '', // CSS selector of input to focus after popup is opened
+
+ closeOnContentClick: false,
+
+ closeOnBgClick: true,
+
+ closeBtnInside: true,
+
+ showCloseBtn: true,
+
+ enableEscapeKey: true,
+
+ modal: false,
+
+ alignTop: false,
+
+ removalDelay: 0,
+
+ prependTo: null,
+
+ fixedContentPos: 'auto',
+
+ fixedBgPos: 'auto',
+
+ overflowY: 'auto',
+
+ closeMarkup: '<button title="%title%" type="button" class="mfp-close">&#215;</button>',
+
+ tClose: 'Close (Esc)',
+
+ tLoading: 'Loading...',
+
+ autoFocusLast: true
+
+ }
+ };
+
+
+
+ $.fn.magnificPopup = function(options) {
+ _checkInstance();
+
+ var jqEl = $(this);
+
+ // We call some API method if first param is a string
+ if (typeof options === "string" ) {
+
+ if(options === 'open') {
+ var items,
+ itemOpts = _isJQ ? jqEl.data('magnificPopup') : jqEl[0].magnificPopup,
+ index = parseInt(arguments[1], 10) || 0;
+
+ if(itemOpts.items) {
+ items = itemOpts.items[index];
+ } else {
+ items = jqEl;
+ if(itemOpts.delegate) {
+ items = items.find(itemOpts.delegate);
+ }
+ items = items.eq( index );
+ }
+ mfp._openClick({mfpEl:items}, jqEl, itemOpts);
+ } else {
+ if(mfp.isOpen)
+ mfp[options].apply(mfp, Array.prototype.slice.call(arguments, 1));
+ }
+
+ } else {
+ // clone options obj
+ options = $.extend(true, {}, options);
+
+ /*
+ * As Zepto doesn't support .data() method for objects
+ * and it works only in normal browsers
+ * we assign "options" object directly to the DOM element. FTW!
+ */
+ if(_isJQ) {
+ jqEl.data('magnificPopup', options);
+ } else {
+ jqEl[0].magnificPopup = options;
+ }
+
+ mfp.addGroup(jqEl, options);
+
+ }
+ return jqEl;
+ };
+
+ /*>>core*/
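For context, a typical initialization of the Magnific Popup core added above, following its public documentation; the `.open-popup-link` selector and the option values are assumptions, not part of this changeset:

```javascript
// Bind a click-to-open popup to links whose href points at an in-page element,
// e.g. <a href="#my-dialog" class="open-popup-link">Open</a>.
$('.open-popup-link').magnificPopup({
  type: 'inline',
  closeOnContentClick: false,
  removalDelay: 300,   // pairs with a CSS transition, if one is defined
  mainClass: 'mfp-fade'
});

// The same API is available programmatically:
// $.magnificPopup.open({ items: { src: '#my-dialog' }, type: 'inline' });
```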
+
+ /*>>inline*/
+
+ var INLINE_NS = 'inline',
+ _hiddenClass,
+ _inlinePlaceholder,
+ _lastInlineElement,
+ _putInlineElementsBack = function() {
+ if(_lastInlineElement) {
+ _inlinePlaceholder.after( _lastInlineElement.addClass(_hiddenClass) ).detach();
+ _lastInlineElement = null;
+ }
+ };
+
+ $.magnificPopup.registerModule(INLINE_NS, {
+ options: {
+ hiddenClass: 'hide', // will be appended with `mfp-` prefix
+ markup: '',
+ tNotFound: 'Content not found'
+ },
+ proto: {
+
+ initInline: function() {
+ mfp.types.push(INLINE_NS);
+
+ _mfpOn(CLOSE_EVENT+'.'+INLINE_NS, function() {
+ _putInlineElementsBack();
+ });
+ },
+
+ getInline: function(item, template) {
+
+ _putInlineElementsBack();
+
+ if(item.src) {
+ var inlineSt = mfp.st.inline,
+ el = $(item.src);
+
+ if(el.length) {
+
+ // If target element has parent - we replace it with placeholder and put it back after popup is closed
+ var parent = el[0].parentNode;
+ if(parent && parent.tagName) {
+ if(!_inlinePlaceholder) {
+ _hiddenClass = inlineSt.hiddenClass;
+ _inlinePlaceholder = _getEl(_hiddenClass);
+ _hiddenClass = 'mfp-'+_hiddenClass;
+ }
+ // replace target inline element with placeholder
+ _lastInlineElement = el.after(_inlinePlaceholder).detach().removeClass(_hiddenClass);
+ }
+
+ mfp.updateStatus('ready');
+ } else {
+ mfp.updateStatus('error', inlineSt.tNotFound);
+ el = $('