
Re-evaluate intensity drift (%) metric #356

Open
jkim0731 opened this issue Nov 2, 2023 · 4 comments
Labels: issue type: QC Issue (QC issue or trend not captured by existing reporting)

Comments

jkim0731 (Collaborator) commented Nov 2, 2023

Describe the Issue

  • Clear intensity drift (probably due to laser instability) is not captured by the metric.
  • The current intensity drift metric falsely flags experiments that are actually fine.

Screenshots
(two screenshots; see the linked slides below)

https://alleninstitute-my.sharepoint.com/:p:/g/personal/jinho_kim_alleninstitute_org/EeRvwo3hEB9Lro1fVVH1tJIBbYvum2CtDe1yG6SMlnAZGA?e=lmDFor

Scope

  • Intensity drift (along with low values) was seen on MESO.2 recently, before the laser change (10/23/2023).
  • Since the laser change it has been fine.
  • This needs to be evaluated better to detect laser instability in the future (or on MESO.1).
  • The false flagging affects ALL experiments.
jkim0731 added the issue type: QC Issue label Nov 2, 2023
jkim0731 (Collaborator, Author) commented Nov 2, 2023

  • Controlled language tag
  • Preventive approach: check laser stability on a regular schedule (e.g., every 2 months, quarterly, etc.) by imaging a fixed sample such as fluorescent beads or pollen grains

seanmcculloch (Collaborator) commented

For now, operators should use the "other" tag to flag the fluorescence plot. During our review of "other" tags, we can create meaningful controlled language tags around the various failures, and potentially update the intensity drift metric (or another metric) to be more comprehensive.

DowntonCrabby (Collaborator) commented

The % change metric currently compares the first portion of the timeseries to the last portion, so you're correct that it won't catch any intermediate drastic changes.
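
For reference, a rough sketch of that first-vs-last comparison (my reconstruction, not the actual QC code; the `edge_frac` fraction and the use of plain means are assumptions):

```python
import numpy as np

def intensity_drift_percent(trace, edge_frac=0.1):
    """First-vs-last % change on a 1-D fluorescence trace.

    Hypothetical reconstruction of the metric described above:
    `edge_frac` (fraction of the trace treated as the first/last
    portion) is a placeholder, not the real value.
    """
    n = len(trace)
    k = max(1, int(n * edge_frac))
    start_mean = np.mean(trace[:k])   # mean intensity of the first portion
    end_mean = np.mean(trace[-k:])    # mean intensity of the last portion
    return 100.0 * (end_mean - start_mean) / start_mean
```

If the trace dips in the middle but recovers by the end, `start_mean` and `end_mean` match and the metric stays quiet, which is exactly the failure mode above.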

Maybe we can make some automated metrics that would be able to catch this, like a sliding window or a standard deviation check. What do you think, @jkim0731?
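
A minimal sketch of what the sliding-window idea could look like, assuming a 1-D mean-fluorescence trace; the window size, step, and flag threshold are placeholders to tune against real data:

```python
import numpy as np

def sliding_window_drift(trace, window=1000, step=500, threshold_pct=20.0):
    """Flag intermediate intensity changes that first-vs-last misses.

    All parameters are placeholder guesses, not tuned values.
    """
    # Mean intensity of each window, stepped across the whole trace.
    window_means = np.array([
        np.mean(trace[i:i + window])
        for i in range(0, len(trace) - window + 1, step)
    ])
    overall = np.mean(trace)
    # Largest % deviation of any window from the session mean: a
    # mid-session dip or spike shows up here even if the endpoints match.
    max_dev_pct = 100.0 * np.max(np.abs(window_means - overall)) / overall
    return max_dev_pct, max_dev_pct > threshold_pct
```

The standard-deviation variant would be the same windowing with `np.std(window_means) / overall` as the score instead of the max deviation.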

samiamseid (Collaborator) commented

Connected to the following issues:
#317
#188
#67
