diff --git a/_quarto.yml b/_quarto.yml index fcc45e6..7902204 100644 --- a/_quarto.yml +++ b/_quarto.yml @@ -54,6 +54,7 @@ website: - section: Cloud-Optimized Point Clouds (COPC) contents: - copc/index.qmd + - copc/lidar-las-to-copc.ipynb - section: GeoParquet contents: - geoparquet/index.qmd diff --git a/copc/environment.yml b/copc/environment.yml new file mode 100644 index 0000000..b66210d --- /dev/null +++ b/copc/environment.yml @@ -0,0 +1,12 @@ +name: coguide-copc +channels: + - conda-forge +dependencies: + - python=3.11 + - earthaccess + - ipykernel + - jupyterlab + - matplotlib + - libgdal>=3.5 + - python-pdal=3.3.0 + - pdal=2.6.3 \ No newline at end of file diff --git a/copc/lidar-las-to-copc.ipynb b/copc/lidar-las-to-copc.ipynb new file mode 100644 index 0000000..ab5d4b7 --- /dev/null +++ b/copc/lidar-las-to-copc.ipynb @@ -0,0 +1,878 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Converting LiDAR LAS Files to Cloud-Optimized Point Clouds (COPCs)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Environment\n", + "\n", + "The packages needed for this notebook can be installed with `conda` or `mamba`. Using the [`environment.yml` from this folder](./environment.yml) run:\n", + "\n", + "```bash\n", + "conda env create -f environment.yml\n", + "```\n", + "\n", + "or\n", + "\n", + "```bash\n", + "mamba env create -f environment.yml\n", + "```\n", + "\n", + "Finally, you may activate and select the kernel in the notebook (running in Jupyter)\n", + "\n", + "```bash\n", + "conda activate coguide-copc\n", + "```\n", + "\n", + "The notebook has been tested to work with the listed Conda environment." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Setup\n", + "\n", + "This tutorial will explore how to-\n", + "\n", + "1. Read a LiDAR LAS file using PDAL in Python\n", + "2. Convert the LiDAR LAS file to Cloud-Optimized Point Cloud (COPC) format\n", + "2. Validate the generated COPC file\n", + "\n", + "## About the Dataset\n", + "\n", + "We will be using the [G-LiHT Lidar Point Cloud V001](http://doi.org/10.5067/Community/GLIHT/GLLIDARPC.001) from the NASA EarthData. To access NASA EarthData in Jupyter you need to register for an [Earthdata account](https://urs.earthdata.nasa.gov/users/new).\n", + "\n", + "We will use [earthaccess](https://github.com/nsidc/earthaccess) library to set up credentials to fetch data from NASA's EarthData catalog." + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "/opt/homebrew/anaconda3/envs/coguide-copc/lib/python3.11/site-packages/tqdm/auto.py:21: TqdmWarning: IProgress not found. Please update jupyter and ipywidgets. See https://ipywidgets.readthedocs.io/en/stable/user_install.html\n", + " from .autonotebook import tqdm as notebook_tqdm\n" + ] + } + ], + "source": [ + "import earthaccess\n", + "import os\n", + "import pdal" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 5, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "earthaccess.login()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Creating a Data Directory for this Tutorial\n", + "\n", + "We are creating a data directory for downloading all the required files locally. 
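+    "\n",
+    "The next cell checks whether the directory exists and creates it if it does not; an equivalent one-liner (shown here only as an aside, not used below) is `os.makedirs`, which also creates any missing parent directories:\n",
+    "\n",
+    "```python\n",
+    "# Create the directory (and any missing parents); do nothing if it already exists\n",
+    "os.makedirs(data_dir, exist_ok=True)\n",
+    "```\n",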
" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [], + "source": [ + "# set data directory path\n", + "data_dir = './data'\n", + "\n", + "# check if directory exists -> if directory doesn't exist, directory is created\n", + "if not os.path.exists(data_dir):\n", + " os.mkdir(data_dir)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Downloading the Dataset from EarthData\n", + "\n", + "We are using `search_data` method from the `earthaccess` module for searching the Granules from the selected collection. The `temporal` argument defines the temporal range for " + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Granules found: 72\n" + ] + } + ], + "source": [ + "# Search Granules\n", + "\n", + "las_item_results = earthaccess.search_data(\n", + " short_name=\"GLLIDARPC\",\n", + " version=\"001\",\n", + " temporal = (\"2020\"), \n", + " count=3\n", + ")" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "[Collection: {'EntryTitle': 'G-LiHT Lidar Point Cloud V001'}\n", + " Spatial coverage: {'HorizontalSpatialDomain': {'Geometry': {'GPolygons': [{'Boundary': {'Points': [{'Longitude': -81.03452828650298, 'Latitude': 25.50220025425373}, {'Longitude': -81.01391715300757, 'Latitude': 25.50220365895999}, {'Longitude': -81.01391819492625, 'Latitude': 25.5112430715201}, {'Longitude': -81.03453087148995, 'Latitude': 25.511239665437053}, {'Longitude': -81.03452828650298, 'Latitude': 25.50220025425373}]}}]}}}\n", + " Temporal coverage: {'RangeDateTime': {'BeginningDateTime': '2020-03-11T04:00:00.000Z', 'EndingDateTime': '2020-03-12T03:59:59.000Z'}}\n", + " Size(MB): 238.623\n", + " Data: ['https://e4ftl01.cr.usgs.gov//GWELD1/COMMUNITY/GLLIDARPC.001/2020.03.11/GLLIDARPC_FL_20200311_FIA8_l0s47.las'],\n", + " Collection: {'EntryTitle': 'G-LiHT Lidar Point Cloud V001'}\n", + " Spatial coverage: {'HorizontalSpatialDomain': {'Geometry': {'GPolygons': [{'Boundary': {'Points': [{'Longitude': -81.02242648723991, 'Latitude': 25.493163090615468}, {'Longitude': -80.99410838333016, 'Latitude': 25.49316468678571}, {'Longitude': -80.99410794242846, 'Latitude': 25.502204110708817}, {'Longitude': -81.02242816553566, 'Latitude': 25.50220251389295}, {'Longitude': -81.02242648723991, 'Latitude': 25.493163090615468}]}}]}}}\n", + " Temporal coverage: {'RangeDateTime': {'BeginningDateTime': '2020-03-11T04:00:00.000Z', 'EndingDateTime': '2020-03-12T03:59:59.000Z'}}\n", + " Size(MB): 248.383\n", + " Data: ['https://e4ftl01.cr.usgs.gov//GWELD1/COMMUNITY/GLLIDARPC.001/2020.03.11/GLLIDARPC_FL_20200311_FIA8_l0s46.las'],\n", + " Collection: {'EntryTitle': 'G-LiHT Lidar Point Cloud V001'}\n", + " Spatial coverage: {'HorizontalSpatialDomain': {'Geometry': {'GPolygons': [{'Boundary': {'Points': [{'Longitude': -80.94099075054905, 'Latitude': 25.276201329530473}, {'Longitude': -80.9355627247816, 'Latitude': 25.276199059361314}, {'Longitude': -80.9355579494582, 'Latitude': 25.285238744206318}, {'Longitude': -80.94098637748567, 'Latitude': 25.285241015299494}, {'Longitude': -80.94099075054905, 'Latitude': 25.276201329530473}]}}]}}}\n", + " Temporal coverage: {'RangeDateTime': {'BeginningDateTime': '2020-03-11T04:00:00.000Z', 'EndingDateTime': '2020-03-12T03:59:59.000Z'}}\n", + " Size(MB): 91.0422\n", + " Data: 
['https://e4ftl01.cr.usgs.gov//GWELD1/COMMUNITY/GLLIDARPC.001/2020.03.11/GLLIDARPC_FL_20200311_FIA8_l0s22.las']]" + ] + }, + "execution_count": 8, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "las_item_results" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's use the file with size 91.04 MB and convert it to a COPC format. " + ] + }, + { + "cell_type": "code", + "execution_count": 39, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + " Getting 1 granules, approx download size: 0.09 GB\n" + ] + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "QUEUEING TASKS | : 100%|██████████| 1/1 [00:00<00:00, 1869.12it/s]\n" + ] + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "File GLLIDARPC_FL_20200311_FIA8_l0s22.las already downloaded\n" + ] + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "PROCESSING TASKS | : 100%|██████████| 1/1 [00:00<00:00, 16131.94it/s]\n", + "COLLECTING RESULTS | : 100%|██████████| 1/1 [00:00<00:00, 33554.43it/s]" + ] + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "data/GLLIDARPC_FL_20200311_FIA8_l0s22.las\n" + ] + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "\n" + ] + } + ], + "source": [ + "# Download Data - Selecting the 3rd file from the `las_item_results` list\n", + "gliht_las_file = earthaccess.download(las_item_results[2], data_dir)\n", + "las_filename = gliht_las_file[0]\n", + "print(las_filename)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## A Brief Introduction to PDAL \n", + "\n", + "[PDAL](https://pdal.io/) (Point Data Abstraction Library) is a C/C++ based open-source library for processing point cloud data. Additionally, it also has a PDAL-Python wrapper to work in a Pythonic environment. \n", + "\n", + "#### Accessing and Getting Metadata Information\n", + "\n", + "PDAL CLI provides multiple [applications](https://pdal.io/en/2.7.0/apps/index.html) for processing point clouds. Also, it allows chaining of these applications for processing point clouds. Similar to `gdal info` for TIFFs, we can run `pdal info ` on the command line for getting metadata from a point cloud file without reading it in memory. 
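+    "\n",
+    "`pdal info` also accepts flags that narrow the output when you only need part of this information; for example, `--summary` prints a short header-level overview and `--metadata` dumps only the file metadata. A minimal sketch, using the `las_filename` variable defined above:\n",
+    "\n",
+    "```python\n",
+    "# Header-level overview only (point count, bounds, dimensions), without per-dimension statistics\n",
+    "!pdal info --summary {las_filename}\n",
+    "```\n",
+    "\n",
+    "The next cell runs the full `pdal info` call, which includes per-dimension statistics.\n",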
" + ] + }, + { + "cell_type": "code", + "execution_count": 37, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "{\n", + " \"file_size\": 95464691,\n", + " \"filename\": \"data/GLLIDARPC_FL_20200311_FIA8_l0s22.las\",\n", + " \"now\": \"2024-03-20T12:30:57-0500\",\n", + " \"pdal_version\": \"2.6.3 (git-version: Release)\",\n", + " \"reader\": \"readers.las\",\n", + " \"stats\":\n", + " {\n", + " \"bbox\":\n", + " {\n", + " \"EPSG:4326\":\n", + " {\n", + " \"bbox\":\n", + " {\n", + " \"maxx\": -80.93555795,\n", + " \"maxy\": 25.28524102,\n", + " \"maxz\": 69.99,\n", + " \"minx\": -80.94099075,\n", + " \"miny\": 25.27619906,\n", + " \"minz\": -12.54\n", + " },\n", + " \"boundary\": { \"type\": \"Polygon\", \"coordinates\": [ [ [ -80.940990750549048, 25.276201329530473, -12.54 ], [ -80.940986377485672, 25.285241015299494, -12.54 ], [ -80.9355579494582, 25.285238744206318, 69.99 ], [ -80.935562724781605, 25.276199059361314, 69.99 ], [ -80.940990750549048, 25.276201329530473, -12.54 ] ] ] }\n", + " },\n", + " \"native\":\n", + " {\n", + " \"bbox\":\n", + " {\n", + " \"maxx\": 506487.7363,\n", + " \"maxy\": 2796533.993,\n", + " \"maxz\": 69.99,\n", + " \"minx\": 505941.2263,\n", + " \"miny\": 2795533.003,\n", + " \"minz\": -12.54\n", + " },\n", + " \"boundary\": { \"type\": \"Polygon\", \"coordinates\": [ [ [ 505941.226302567636594, 2795533.003240843303502, -12.54 ], [ 505941.226302567636594, 2796533.993240843061358, -12.54 ], [ 506487.736302567645907, 2796533.993240843061358, 69.99 ], [ 506487.736302567645907, 2795533.003240843303502, 69.99 ], [ 505941.226302567636594, 2795533.003240843303502, -12.54 ] ] ] }\n", + " }\n", + " },\n", + " \"statistic\":\n", + " [\n", + " {\n", + " \"average\": 506237.8598,\n", + " \"count\": 3409439,\n", + " \"maximum\": 506487.7363,\n", + " \"minimum\": 505941.2263,\n", + " \"name\": \"X\",\n", + " \"position\": 0,\n", + " \"stddev\": 101.3857552,\n", + " \"variance\": 10279.07135\n", + " },\n", + " {\n", + " \"average\": 2795977.637,\n", + " \"count\": 3409439,\n", + " \"maximum\": 2796533.993,\n", + " \"minimum\": 2795533.003,\n", + " \"name\": \"Y\",\n", + " \"position\": 1,\n", + " \"stddev\": 274.313888,\n", + " \"variance\": 75248.10912\n", + " },\n", + " {\n", + " \"average\": 2.192797205,\n", + " \"count\": 3409439,\n", + " \"maximum\": 69.99,\n", + " \"minimum\": -12.54,\n", + " \"name\": \"Z\",\n", + " \"position\": 2,\n", + " \"stddev\": 1.788122887,\n", + " \"variance\": 3.197383461\n", + " },\n", + " {\n", + " \"average\": 30205.96606,\n", + " \"count\": 3409439,\n", + " \"maximum\": 65535,\n", + " \"minimum\": 14789,\n", + " \"name\": \"Intensity\",\n", + " \"position\": 3,\n", + " \"stddev\": 5497.346879,\n", + " \"variance\": 30220822.71\n", + " },\n", + " {\n", + " \"average\": 1.200336478,\n", + " \"count\": 3409439,\n", + " \"maximum\": 5,\n", + " \"minimum\": 1,\n", + " \"name\": \"ReturnNumber\",\n", + " \"position\": 4,\n", + " \"stddev\": 0.4267302243,\n", + " \"variance\": 0.1820986843\n", + " },\n", + " {\n", + " \"average\": 1.400683808,\n", + " \"count\": 3409439,\n", + " \"maximum\": 5,\n", + " \"minimum\": 1,\n", + " \"name\": \"NumberOfReturns\",\n", + " \"position\": 5,\n", + " \"stddev\": 0.5529822902,\n", + " \"variance\": 0.3057894133\n", + " },\n", + " {\n", + " \"average\": 0.5248757933,\n", + " \"count\": 3409439,\n", + " \"maximum\": 1,\n", + " \"minimum\": 0,\n", + " \"name\": \"ScanDirectionFlag\",\n", + " \"position\": 6,\n", + " \"stddev\": 0.4993808847,\n", + " 
\"variance\": 0.249381268\n", + " },\n", + " {\n", + " \"average\": 0.002420339534,\n", + " \"count\": 3409439,\n", + " \"maximum\": 1,\n", + " \"minimum\": 0,\n", + " \"name\": \"EdgeOfFlightLine\",\n", + " \"position\": 7,\n", + " \"stddev\": 0.04913738087,\n", + " \"variance\": 0.002414482199\n", + " },\n", + " {\n", + " \"average\": 1.239596602,\n", + " \"count\": 3409439,\n", + " \"maximum\": 2,\n", + " \"minimum\": 1,\n", + " \"name\": \"Classification\",\n", + " \"position\": 8,\n", + " \"stddev\": 0.4268373506,\n", + " \"variance\": 0.1821901239\n", + " },\n", + " {\n", + " \"average\": 2.569738893,\n", + " \"count\": 3409439,\n", + " \"maximum\": 33,\n", + " \"minimum\": -31,\n", + " \"name\": \"ScanAngleRank\",\n", + " \"position\": 9,\n", + " \"stddev\": 16.33805559,\n", + " \"variance\": 266.9320603\n", + " },\n", + " {\n", + " \"average\": 1.475124207,\n", + " \"count\": 3409439,\n", + " \"maximum\": 2,\n", + " \"minimum\": 1,\n", + " \"name\": \"UserData\",\n", + " \"position\": 10,\n", + " \"stddev\": 0.4993808847,\n", + " \"variance\": 0.249381268\n", + " },\n", + " {\n", + " \"average\": 192.5227238,\n", + " \"count\": 3409439,\n", + " \"maximum\": 65535,\n", + " \"minimum\": 0,\n", + " \"name\": \"PointSourceId\",\n", + " \"position\": 11,\n", + " \"stddev\": 680.5012031,\n", + " \"variance\": 463081.8874\n", + " },\n", + " {\n", + " \"average\": 310476.1839,\n", + " \"count\": 3409439,\n", + " \"maximum\": 310485.0477,\n", + " \"minimum\": 310469.0825,\n", + " \"name\": \"GpsTime\",\n", + " \"position\": 12,\n", + " \"stddev\": 4.240892261,\n", + " \"variance\": 17.98516717\n", + " },\n", + " {\n", + " \"average\": 0,\n", + " \"count\": 3409439,\n", + " \"maximum\": 0,\n", + " \"minimum\": 0,\n", + " \"name\": \"Synthetic\",\n", + " \"position\": 13,\n", + " \"stddev\": 0,\n", + " \"variance\": 0\n", + " },\n", + " {\n", + " \"average\": 0,\n", + " \"count\": 3409439,\n", + " \"maximum\": 0,\n", + " \"minimum\": 0,\n", + " \"name\": \"KeyPoint\",\n", + " \"position\": 14,\n", + " \"stddev\": 0,\n", + " \"variance\": 0\n", + " },\n", + " {\n", + " \"average\": 0,\n", + " \"count\": 3409439,\n", + " \"maximum\": 0,\n", + " \"minimum\": 0,\n", + " \"name\": \"Withheld\",\n", + " \"position\": 15,\n", + " \"stddev\": 0,\n", + " \"variance\": 0\n", + " },\n", + " {\n", + " \"average\": 0,\n", + " \"count\": 3409439,\n", + " \"maximum\": 0,\n", + " \"minimum\": 0,\n", + " \"name\": \"Overlap\",\n", + " \"position\": 16,\n", + " \"stddev\": 0,\n", + " \"variance\": 0\n", + " }\n", + " ]\n", + " }\n", + "}\n" + ] + } + ], + "source": [ + "!pdal info {las_filename}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "\n", + "#### PDAL Pipelines\n", + "\n", + "For converting the LiDAR LAS file to COPC format, we will define a [pdal pipeline](https://pdal.io/en/latest/pipeline.html). A pipeline defines data processing within pdal for reading (using [pdal readers](https://pdal.io/en/latest/stages/readers.html)), processing (using [pdal filters](https://pdal.io/en/latest/stages/filters.html)) and writing operations (using [pdal writers](https://pdal.io/en/latest/stages/writers.html)). The pipelines can also represent sequential operations and can be executed as [_stages_](https://pdal.io/en/latest/pipeline.html#stage-object).\n", + "\n", + "A pdal pipeline is defined in a JSON format either as a JSON object or a JSON array. Below is an example of a pdal pipeline taking a `.las` file as input, generating `stats` and writing it to a COPC format. 
\n",
+    "\n",
+    "```json\n",
+    "{\n",
+    "  \"pipeline\": [\n",
+    "    {\n",
+    "      \"type\": \"readers.las\",\n",
+    "      \"filename\": \"data/GLLIDARPC_FL_20200311_FIA8_l0s22.las\"\n",
+    "    },\n",
+    "    {\n",
+    "      \"type\": \"filters.stats\"\n",
+    "    },\n",
+    "    {\n",
+    "      \"type\": \"writers.copc\",\n",
+    "      \"filename\": \"data/GLLIDARPC_FL_20200311_FIA8_l0s22.copc.laz\"\n",
+    "    }\n",
+    "  ]\n",
+    "}\n",
+    "```\n",
+    "\n",
+    "This pipeline can be executed with `pdal pipeline <pipeline.json>` from the command line when the pipeline is saved as a local JSON file.\n",
+    "\n",
+    "#### Programmatic Pipeline Construction\n",
+    "\n",
+    "Here, however, we will use a more concise, Pythonic approach to define and execute a pipeline. It is based on the [PDAL Python extension](https://pypi.org/project/pdal/), which supports programmatic pipeline construction in addition to the JSON-based approach discussed above.\n",
+    "\n",
+    "This approach uses the `|` operator to pipe stages together into a pipeline. For example, the pipeline above can be written as:\n",
+    "\n",
+    "```python\n",
+    "pipeline = pdal.Reader.las(filename=las_filename) | pdal.Filter.stats() | pdal.Writer.copc(filename=copc_filename)\n",
+    "```\n",
+    "The pipeline is then executed by calling `pipeline.execute()`."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## LAS to COPC Conversion\n",
+    "\n",
+    "Now, let's convert the LAS file to COPC format using programmatic pipeline construction."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 64,
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "text/plain": [
+       "'data/GLLIDARPC_FL_20200311_FIA8_l0s22.copc.laz'"
+      ]
+     },
+     "execution_count": 64,
+     "metadata": {},
+     "output_type": "execute_result"
+    }
+   ],
+   "source": [
+    "# Define the output filename. By convention, COPC files use the .copc.laz extension\n",
+    "copc_filename = las_filename.replace('.las', '.copc.laz')\n",
+    "copc_filename"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 65,
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "text/plain": [
+       "3409439"
+      ]
+     },
+     "execution_count": 65,
+     "metadata": {},
+     "output_type": "execute_result"
+    }
+   ],
+   "source": [
+    "# pipe = stage 1 | stage 2 | stage 3\n",
+    "# Or, pipeline = pipeline 1 | stage 2\n",
+    "\n",
+    "# On successful execution, the pipeline returns the number of points processed\n",
+    "pipe = pdal.Reader.las(filename=las_filename) | pdal.Writer.copc(filename=copc_filename)\n",
+    "pipe.execute()"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Validation\n",
+    "\n",
+    "As the output of the cell below shows, the `.copc.laz` file has been created in the destination directory."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 48,
+   "metadata": {},
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "total 239888\n",
+      "-rw-r--r--  1    26M Mar 20 11:55 GLLIDARPC_FL_20200311_FIA8_l0s22.copc.laz\n",
+      "-rw-r--r--  1    91M Feb 29 11:27 GLLIDARPC_FL_20200311_FIA8_l0s22.las\n"
+     ]
+    }
+   ],
+   "source": [
+    "# -go omits owner and group details; -h prints human-readable file sizes\n",
+    "!ls -goh ./data"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Let's read the created COPC file again and check the value of the `copc` flag from the [metadata](https://pdal.io/en/latest/development/metadata.html). If the generated LiDAR file is a valid COPC file, this flag should be set to `True`.",
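+    "\n",
+    "The next cell does this check in Python. A quick command-line alternative (a sketch; the exact metadata layout can vary between PDAL versions) is to dump the reader metadata with `pdal info --metadata` and look for a `copc` entry set to `true`:\n",
+    "\n",
+    "```python\n",
+    "# Dump the reader metadata as JSON; a valid COPC file should report a copc flag of true\n",
+    "!pdal info --metadata {copc_filename}\n",
+    "```"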
+ ] + }, + { + "cell_type": "code", + "execution_count": 81, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "True\n" + ] + } + ], + "source": [ + "valid_pipe = pdal.Reader.copc(filename=copc_filename) | pdal.Filter.stats()\n", + "valid_pipe.execute()\n", + "\n", + "# Getting value for the \"copc\" key under the metadata\n", + "# Output is True for a valid COPC\n", + "value = valid_pipe.metadata[\"metadata\"][\"readers.copc\"].get(\"copc\")\n", + "print(value)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Accessing Data\n", + "\n", + "The data values can be accessed from the executed pipeline using `valid_pipe.arrays`. The values in the arrays represent the LiDAR point cloud attributes such as `X`, `Y`, `Z`, and `Intensity`, etc." + ] + }, + { + "cell_type": "code", + "execution_count": 73, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[array([(506245.56, 2796471.44, 0.24, 40740, 1, 1, 1, 0, 2, 0, 0, 0, 0, 16.998, 1, 0, 310483.75227621, 0),\n", + " (506247.16, 2796471.58, 0.27, 35541, 2, 2, 1, 0, 2, 0, 0, 0, 0, 16.998, 1, 0, 310483.75229014, 0),\n", + " (506247.95, 2796471.65, 0.24, 17716, 2, 2, 1, 0, 2, 0, 0, 0, 0, 16.998, 1, 0, 310483.75229699, 0),\n", + " ...,\n", + " (506066.58, 2796032.75, 2.34, 31587, 1, 1, 0, 0, 1, 0, 0, 0, 0, -24. , 2, 203, 310477.36925451, 0),\n", + " (506067.37, 2796033.29, 2.52, 32876, 1, 1, 0, 0, 1, 0, 0, 0, 0, -22.998, 2, 216, 310477.37590641, 0),\n", + " (506062.6 , 2796033.27, 1.4 , 27393, 1, 1, 0, 0, 1, 0, 0, 0, 0, -24. , 2, 108, 310477.38259945, 0)],\n", + " dtype=[('X', '
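+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Each entry in `valid_pipe.arrays` is a NumPy structured array, so standard NumPy and matplotlib tooling applies directly. Below is a minimal sketch (assuming the `valid_pipe` pipeline executed above and the matplotlib package from this folder's environment file) that plots a subsample of the points colored by elevation:\n",
+    "\n",
+    "```python\n",
+    "import matplotlib.pyplot as plt\n",
+    "\n",
+    "points = valid_pipe.arrays[0]\n",
+    "subset = points[::100]  # thin the ~3.4 million points for plotting\n",
+    "\n",
+    "# Scatter X/Y positions, colored by the Z (elevation) dimension\n",
+    "plt.scatter(subset['X'], subset['Y'], c=subset['Z'], s=1, cmap='viridis')\n",
+    "plt.colorbar(label='Z')\n",
+    "plt.axis('equal')\n",
+    "plt.show()\n",
+    "```"
+   ]
+  }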