IO AP sensing (Follow up of #219 and #218) (#220)
* Device ID bug for AP Sensing fixed. The device ID is N4386B instead of C320; C320 was an arbitrary name the user had given to the wellbore.

* More AP Sensing test data added. Advantage compared to the existing data: both .xml and .tra were exported, which provides more metadata, readings of the instrument's temperature sensors, and the instrument-computed log-ratio and attenuation.

* apsensing io updated to read both .xml and .tra files.
Added a check whether a .tra file with an identical timestamp exists in the .xml directory (see the sketch below). If yes, it is used to import:
- t_by_dts, log_ratio, loss
- PT100 data (if it exists in the file)
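A minimal sketch of how such a timestamp check could look; find_matching_tra is a hypothetical helper (not the exact function added in this PR) and only assumes the filename convention visible in the test data, e.g. _AP Sensing_N4386B_3_20180118201727.xml:

```python
# Hypothetical helper: pair an AP Sensing .xml file with a .tra file that
# carries the same trailing "_<timestamp>" in its filename.
import glob
import os
from typing import Optional


def find_matching_tra(xml_path: str) -> Optional[str]:
    """Return a .tra file in the same directory sharing the .xml file's timestamp."""
    stem = os.path.splitext(os.path.basename(xml_path))[0]
    timestamp = stem.split("_")[-1]  # e.g. "20180118201727"
    candidates = glob.glob(os.path.join(os.path.dirname(xml_path), "*.tra"))
    matches = [p for p in candidates if timestamp in os.path.basename(p)]
    return matches[0] if matches else None
```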

* apsensing .tra file parser updated based on Bart's generic parser idea - NOT tested yet

* cleaned up comments and docstrings in apsensing.py

* Added a test to test_datastore.py inside the function test_read_apsensing_files (see the sketch below).
Changelog updated.
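A hypothetical sketch of the kind of check such a test could perform; the import path is assumed, and the actual assertions in the PR may differ:

```python
# Hypothetical test sketch; the actual test in tests/test_datastore.py
# may assert different variables or values.
import os

from dtscalibration import read_apsensing_files  # import path assumed


def test_read_apsensing_files():
    filepath = os.path.join("tests", "data", "ap_sensing_2", "CH1_SE")
    ds = read_apsensing_files(directory=filepath)
    # .tra files in the same directory should contribute the PT100 reference probes
    assert "probe1Temperature" in ds.data_vars
```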

* AP Sensing Device name changed in README.rst

* Corrected the linting errors via hatch run format in the files apsensing.py, sensornet.py and sensortran.py

* Update CHANGELOG.rst

Co-authored-by: Bart Schilperoort <[email protected]>

* Update CHANGELOG.rst

Co-authored-by: Bart Schilperoort <[email protected]>

* Update CHANGELOG.rst

Co-authored-by: Bart Schilperoort <[email protected]>

* Update src/dtscalibration/io/apsensing.py printout language fix

Co-authored-by: Bart Schilperoort <[email protected]>

* - Renamed the AP Sensing data explanation and changed it to Markdown format
- Changed the detection algorithm for .tra files
- Added a double-ended calibration test of the ap_sensing_2 data to test_datastore.py

* Bart's newly written append_to_data_vars_structure function added and tested.

* Added a flag to load .tra array data only when specified.

* automatic changes by hatch run format

* Added to the docstring of io/apsensing.py (see the usage sketch below):
    - the load_tra_arrays flag
    - the current .tra implementation is limited to in-memory reading only
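A brief usage sketch of that flag; the import path, the data directory, and the flag's default value are assumptions here:

```python
# Usage sketch: directory and import path are illustrative; load_tra_arrays
# is the flag documented in io/apsensing.py.
from dtscalibration import read_apsensing_files

ds = read_apsensing_files(
    directory="tests/data/ap_sensing_2/CH1_SE",
    load_tra_arrays=True,  # opt in to .tra-derived arrays (e.g. PT100 data)
)
print(ds.probe1Temperature)  # reference probe readings parsed from the .tra files
```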

* Move CI linting to Python 3.9

* Ignore weird mypy error

* Fix broken notebook

* updated docs/notebooks/A3Load_ap_sensing_files.ipynb
    - added explanation for .xml and .tra files
    - added an example of importing .xml + .tra data

---------

Co-authored-by: David Lah <[email protected]>
Co-authored-by: Bart Schilperoort <[email protected]>
Co-authored-by: Bart Schilperoort <[email protected]>
4 people authored Sep 12, 2024
1 parent b51db54 commit 4a3aa8a
Showing 32 changed files with 24,134 additions and 35 deletions.
4 changes: 2 additions & 2 deletions .github/workflows/build.yml
@@ -16,10 +16,10 @@ jobs:
fail-fast: false
steps:
- uses: actions/checkout@v3
- name: Set up Python 3.10
- name: Set up Python 3.9
uses: actions/setup-python@v3
with:
python-version: "3.10"
python-version: "3.9"
- name: Python info
shell: bash -l {0}
run: |
12 changes: 12 additions & 0 deletions CHANGELOG.rst
@@ -1,6 +1,18 @@

Changelog
=========
3.0.4 (2024-08-30)
---

Fixed

* Device ID bug for AP Sensing. The device ID is N4386B instead of C320; C320 was an arbitrary name the user had given to the wellbore.

Added

* More test data from the AP Sensing device N4386B, which also include the corresponding .tra log files
* AP Sensing .tra support, as the reference temperature sensor data of this device is only logged in the .tra and not in the .xml log files.
  Added functions in io/apsensing.py to read .tra files if they are in the same directory as the .xml files.

3.0.3 (2024-04-18)
---
2 changes: 1 addition & 1 deletion README.rst
@@ -88,7 +88,7 @@ Devices currently supported
===========================
* Silixa Ltd.: **Ultima** & **XT-DTS** .xml files *(up to version 8.1)*
* Sensornet Ltd.: **Oryx**, **Halo** & **Sentinel** .ddf files
* AP Sensing: **CP320** .xml files *(single ended only)*
* AP Sensing: **N4386B** .xml files *(single ended only)*
* SensorTran: **SensorTran 5100** .dat binary files *(single ended only)*

Documentation
4 changes: 2 additions & 2 deletions docs/notebooks/04Calculate_variance_Stokes.ipynb
@@ -185,8 +185,8 @@
"import scipy\n",
"import numpy as np\n",
"\n",
"sigma = residuals.std()\n",
"mean = residuals.mean()\n",
"sigma = residuals.std().to_numpy()\n",
"mean = residuals.mean().to_numpy()\n",
"x = np.linspace(mean - 3 * sigma, mean + 3 * sigma, 100)\n",
"approximated_normal_fit = scipy.stats.norm.pdf(x, mean, sigma)\n",
"residuals.plot.hist(bins=50, figsize=(12, 8), density=True)\n",
164 changes: 151 additions & 13 deletions docs/notebooks/A3Load_ap_sensing_files.ipynb
@@ -5,12 +5,20 @@
"metadata": {},
"source": [
"# A3. Loading AP Sensing files\n",
"This example loads AP sensing files. Only single-ended files are currently supported. Just like with Silixa's devices, the AP Sensing data is in .xml files"
"This example loads AP sensing files. Only single-ended files are currently supported. \n",
"\n",
"The currently supported AP Sensing N4386B device has two data logging options to log into .xml files and .tra files. Only .xml files contain the stokes and anti-stokes intensities needed for this calibration. Unfortunately, these .xml files are scarce on metadata and do not contain the additionally connected sensors e.g. PT100 from the device. The latter are contained inside the .tra file.\n",
"\n",
"If you did not connect any additional sensors, you can use .xml files only and add your own logged temperature data to the datastore for calibration. (Hint: The .xml file export is well hidden in your AP Sensing software *DTS Configurator* and not documented in the user manual. Inside your *Configuration* turn *POSC export* on - this will export the .xml file.)\n",
"\n",
"If you want to additionally use data exported to .tra files (e.g. PT100 data) use the .tra logging make sure to enable *Auto Save Traces* in under *Program Options* and make sure *Create Multitrace files* and *Use Binary Format* are both disabled. Make sure to place the .tra files into the identical directory as the .xml files. Then they will be imported automatically with the *read_apsensing_files* commmand.\n",
"\n",
"The current implementation of .tra file parsing is limited to in-memory reading only."
]
},
{
"cell_type": "code",
"execution_count": null,
"execution_count": 12,
"metadata": {
"execution": {
"iopub.execute_input": "2022-04-06T08:12:29.520519Z",
@@ -36,7 +44,7 @@
},
{
"cell_type": "code",
"execution_count": null,
"execution_count": 13,
"metadata": {
"execution": {
"iopub.execute_input": "2022-04-06T08:12:31.219744Z",
@@ -45,15 +53,24 @@
"shell.execute_reply": "2022-04-06T08:12:31.224123Z"
}
},
"outputs": [],
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"..\\..\\tests\\data\\ap_sensing\n"
]
}
],
"source": [
"filepath = os.path.join(\"..\", \"..\", \"tests\", \"data\", \"ap_sensing\")\n",
"filepath_with_tra = os.path.join(\"..\", \"..\", \"tests\", \"data\", \"ap_sensing_2\", \"CH1_SE\")\n",
"print(filepath)"
]
},
{
"cell_type": "code",
"execution_count": null,
"execution_count": 14,
"metadata": {
"execution": {
"iopub.execute_input": "2022-04-06T08:12:31.254656Z",
@@ -62,7 +79,17 @@
"shell.execute_reply": "2022-04-06T08:12:31.258995Z"
}
},
"outputs": [],
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"_AP Sensing_N4386B_3_20180118201727.xml\n",
"_AP Sensing_N4386B_3_20180118202957.xml\n",
"_AP Sensing_N4386B_3_20180118205357.xml\n"
]
}
],
"source": [
"filepathlist = sorted(glob.glob(os.path.join(filepath, \"*.xml\")))\n",
"filenamelist = [os.path.basename(path) for path in filepathlist]\n",
Expand All @@ -80,7 +107,7 @@
},
{
"cell_type": "code",
"execution_count": null,
"execution_count": 15,
"metadata": {
"execution": {
"iopub.execute_input": "2022-04-06T08:12:31.262782Z",
@@ -89,23 +116,51 @@
"shell.execute_reply": "2022-04-06T08:12:31.692317Z"
}
},
"outputs": [],
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"3 files were found, each representing a single timestep\n",
"4 recorded vars were found: LAF, TEMP, ST, AST\n",
"Recorded at 7101 points along the cable\n",
"The measurement is single ended\n",
"Reading the data from disk\n",
"3 files were found, each representing a single timestep\n",
"4 recorded vars were found: LAF, TEMP, ST, AST\n",
"Recorded at 1201 points along the cable\n",
"The measurement is single ended\n",
"Reading the data from disk\n",
".tra files exist and will be read\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"C:\\Users\\David Lah\\Documents\\dts-data-processing\\extern\\python-dts-calibration\\src\\dtscalibration\\io\\apsensing.py:480: UserWarning: Not all .xml files have a matching .tra file.\n",
" Missing are time following timestamps {'20180118202957', '20180118201727', '20180118205357'}. Not loading .tra data.\n",
" warnings.warn(msg)\n"
]
}
],
"source": [
"ds = read_apsensing_files(directory=filepath)"
"ds = read_apsensing_files(directory=filepath)\n",
"ds_with_tra = read_apsensing_files(directory=filepath_with_tra)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The object tries to gather as much metadata from the measurement files as possible (temporal and spatial coordinates, filenames, temperature probes measurements). All other configuration settings are loaded from the first files and stored as attributes of the `xarray.Dataset`.\n",
"\n",
"y\n",
"Calibration follows as usual (see the other notebooks)."
]
},
{
"cell_type": "code",
"execution_count": null,
"execution_count": 16,
"metadata": {
"execution": {
"iopub.execute_input": "2022-04-06T08:12:31.695872Z",
@@ -114,10 +169,93 @@
"shell.execute_reply": "2022-04-06T08:12:31.705163Z"
}
},
"outputs": [],
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"<xarray.Dataset> Size: 569kB\n",
"Dimensions: (x: 7101, time: 3)\n",
"Coordinates:\n",
" * x (x) float64 57kB 0.0 0.5 1.0 ... 3.549e+03 3.55e+03 3.55e+03\n",
" filename (time) <U39 468B '_AP Sensing_N4386B_3_20180118201727.xml' ...\n",
" * time (time) datetime64[ns] 24B 2018-01-18T20:17:27 ... 2018-01-1...\n",
"Data variables:\n",
" tmp (x, time) float64 170kB 12.16 11.32 12.26 ... 15.08 17.83\n",
" st (x, time) float64 170kB 1.098 1.105 ... 3.39e-18 3.409e-18\n",
" ast (x, time) float64 170kB 0.1888 0.1891 ... 4.838e-19 4.945e-19\n",
" creationDate (time) datetime64[ns] 24B 2018-01-18T20:17:27 ... 2018-01-1...\n",
"Attributes: (12/51)\n",
" wellbore:uid: ...\n",
" wellbore:name: ...\n",
" wellbore:dtsInstalledSystemSet:dtsInstalledSystem:uid: ...\n",
" wellbore:dtsInstalledSystemSet:dtsInstalledSystem:name: ...\n",
" wellbore:dtsInstalledSystemSet:dtsInstalledSystem:fiberInformation:fiber:...\n",
" wellbore:dtsInstalledSystemSet:dtsInstalledSystem:fiberInformation:fiber:...\n",
" ... ...\n",
" wellbore:wellLogSet:wellLog:blockInfo:blockCurveInfo_2:columnIndex: ...\n",
" wellbore:wellLogSet:wellLog:blockInfo:blockCurveInfo_3:curveId: ...\n",
" wellbore:wellLogSet:wellLog:blockInfo:blockCurveInfo_3:columnIndex: ...\n",
" isDoubleEnded: ...\n",
" forwardMeasurementChannel: ...\n",
" backwardMeasurementChannel: ...\n"
]
}
],
"source": [
"print(ds)"
]
},
{
"cell_type": "code",
"execution_count": 17,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"<xarray.DataArray 'probe1Temperature' (time: 3)> Size: 24B\n",
"array([19.60636, 19.62306, 19.62306])\n",
"Coordinates:\n",
" filename (time) <U45 540B 'CH1_SE_AP Sensing_N4386B_1_20240130141820.xml...\n",
" * time (time) datetime64[ns] 24B 2024-01-30T14:18:20 ... 2024-01-30T14...\n",
"<xarray.Dataset> Size: 97kB\n",
"Dimensions: (x: 1201, time: 3)\n",
"Coordinates:\n",
" * x (x) float64 10kB -50.0 -49.75 -49.5 ... 249.5 249.8 250.0\n",
" filename (time) <U45 540B 'CH1_SE_AP Sensing_N4386B_1_202401301...\n",
" * time (time) datetime64[ns] 24B 2024-01-30T14:18:20 ... 2024...\n",
"Data variables:\n",
" tmp (x, time) float64 29kB 22.49 22.85 23.14 ... 20.3 19.71\n",
" st (x, time) float64 29kB 1.254 1.256 ... 0.8482 0.8397\n",
" ast (x, time) float64 29kB 0.2453 0.2461 ... 0.163 0.1609\n",
" creationDate (time) datetime64[ns] 24B 2024-01-30T14:18:20 ... 2024...\n",
" probe1Temperature (time) float64 24B 19.61 19.62 19.62\n",
" probe2Temperature (time) float64 24B 50.18 50.17 50.18\n",
" probe3Temperature (time) float64 24B 18.57 18.6 18.56\n",
" probe4Temperature (time) float64 24B 18.53 18.55 18.56\n",
"Attributes: (12/51)\n",
" wellbore:uid: ...\n",
" wellbore:name: ...\n",
" wellbore:dtsInstalledSystemSet:dtsInstalledSystem:uid: ...\n",
" wellbore:dtsInstalledSystemSet:dtsInstalledSystem:name: ...\n",
" wellbore:dtsInstalledSystemSet:dtsInstalledSystem:fiberInformation:fiber:...\n",
" wellbore:dtsInstalledSystemSet:dtsInstalledSystem:fiberInformation:fiber:...\n",
" ... ...\n",
" wellbore:wellLogSet:wellLog:blockInfo:blockCurveInfo_2:columnIndex: ...\n",
" wellbore:wellLogSet:wellLog:blockInfo:blockCurveInfo_3:curveId: ...\n",
" wellbore:wellLogSet:wellLog:blockInfo:blockCurveInfo_3:columnIndex: ...\n",
" isDoubleEnded: ...\n",
" forwardMeasurementChannel: ...\n",
" backwardMeasurementChannel: ...\n"
]
}
],
"source": [
"print(ds_with_tra.probe1Temperature)\n",
"print(ds_with_tra)"
]
}
],
"metadata": {
@@ -136,7 +274,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.11"
"version": "3.10.4"
}
},
"nbformat": 4,
12 changes: 7 additions & 5 deletions src/dtscalibration/dts_accessor_utils.py
@@ -717,7 +717,9 @@ def merge_double_ended(
ds_fw.attrs["isDoubleEnded"] == "0" and ds_bw.attrs["isDoubleEnded"] == "0"
), "(one of the) input DataStores is already double ended"

ds_fw, ds_bw = merge_double_ended_times(ds_fw, ds_bw, verbose=verbose, verify_timedeltas=verify_timedeltas)
ds_fw, ds_bw = merge_double_ended_times(
ds_fw, ds_bw, verbose=verbose, verify_timedeltas=verify_timedeltas
)

ds = ds_fw.copy()
ds_bw = ds_bw.copy()
@@ -741,10 +743,10 @@
ds.attrs["isDoubleEnded"] = "1"
ds["userAcquisitionTimeBW"] = ("time", ds_bw["userAcquisitionTimeFW"].values)

if plot_result:
_, ax = plt.subplots()
ds["st"].isel(time=0).plot(ax=ax, label="Stokes forward")
ds["rst"].isel(time=0).plot(ax=ax, label="Stokes backward")
if plot_result: # type: ignore
_, ax = plt.subplots() # type: ignore
ds["st"].isel(time=0).plot(ax=ax, label="Stokes forward") # type: ignore
ds["rst"].isel(time=0).plot(ax=ax, label="Stokes backward") # type: ignore
ax.legend()

return ds
