diff --git a/apps/docs/docs/contribute/connect-data/airbyte.md b/apps/docs/docs/contribute/connect-data/airbyte.md
index 3fe0d97d5..e1ca9707d 100644
--- a/apps/docs/docs/contribute/connect-data/airbyte.md
+++ b/apps/docs/docs/contribute/connect-data/airbyte.md
@@ -1,5 +1,5 @@
---
-title: Connect via Airbyte
+title: 🏗️ Connect via Airbyte
sidebar_position: 2
---
diff --git a/apps/docs/docs/contribute/connect-data/gcs.md b/apps/docs/docs/contribute/connect-data/gcs.md
index c2832049b..335dd5ea5 100644
--- a/apps/docs/docs/contribute/connect-data/gcs.md
+++ b/apps/docs/docs/contribute/connect-data/gcs.md
@@ -1,5 +1,5 @@
---
-title: Connect via Google Cloud Storage (GCS)
+title: 🏗️ Connect via Google Cloud Storage (GCS)
sidebar_position: 4
---
diff --git a/apps/docs/docs/contribute/impact-models.md b/apps/docs/docs/contribute/impact-models.md
index 79cc10743..20e02316e 100644
--- a/apps/docs/docs/contribute/impact-models.md
+++ b/apps/docs/docs/contribute/impact-models.md
@@ -36,7 +36,7 @@ Before you begin you'll need the following on your system:
- Python Poetry >= 1.8 (see [here](https://pypi.org/project/poetry/) to install it)
- git (see [here](https://github.com/git-guides/install-git) if you don't have it installed)
- A GitHub account (see [here](https://github.com/join) to open a new account)
-- BigQuery access (see [here](../get-started/#login-to-bigquery) if you don't have it setup already)
+- BigQuery access (see [here](../get-started) if you don't have it set up already)
### Install `gcloud`
diff --git a/apps/docs/docs/contribute/index.mdx b/apps/docs/docs/contribute/index.mdx
index 0028d0b75..5cf2a5e75 100644
--- a/apps/docs/docs/contribute/index.mdx
+++ b/apps/docs/docs/contribute/index.mdx
@@ -18,37 +18,37 @@ There are a variety of ways you can contribute to OSO. This doc features some of
Work on a specific data challenge and get paid for your contributions.
Data Scientists, Analysts
diff --git a/apps/docs/docs/contribute/share-insights.md b/apps/docs/docs/contribute/share-insights.md
index 42144b8ab..2ff57cec9 100644
--- a/apps/docs/docs/contribute/share-insights.md
+++ b/apps/docs/docs/contribute/share-insights.md
@@ -19,7 +19,7 @@ Share your work analyzing and visualizing OSS data by contributing to the [Insig
We've included some starter notebooks to help data scientists get going with OSO datasets. You can find templates and community notebooks [here](https://github.com/opensource-observer/insights/blob/main/community/notebooks).
-Also check out our playbook for [doing data science](../integrate/data-science) with OSO data.
+Also check out our playbook for [doing data science](../integrate/python-notebooks) with OSO data.
If you've created a notebook that you think our community can learn from, submit a PR to the `./community/notebooks/` directory in the Insights repo. Please include markdown in your notebook to explain your work.
diff --git a/apps/docs/docs/get-started/index.mdx b/apps/docs/docs/get-started/index.mdx
index 3d5ceb921..e94dd8445 100644
--- a/apps/docs/docs/get-started/index.mdx
+++ b/apps/docs/docs/get-started/index.mdx
@@ -89,7 +89,8 @@ To explore all the OSO datasets available, see [here](https://console.cloud.goog
Now that you're set up, there are many ways to contribute to OSO and integrate the data with your application:
-- [Do Data Science](../integrate/data-science) over OSO data sets
+- [BigQuery Studio Guide](../integrate/query-data)
+- [Write Python notebooks](../integrate/python-notebooks)
- [Propose an impact model](../contribute/impact-models) to run in our data pipeline
- [Query the OSO API](../integrate/api) for metrics and impact vectors from your web app
diff --git a/apps/docs/docs/integrate/3rd-party.md b/apps/docs/docs/integrate/3rd-party.md
new file mode 100644
index 000000000..5a80cb67b
--- /dev/null
+++ b/apps/docs/docs/integrate/3rd-party.md
@@ -0,0 +1,8 @@
+---
+title: 🏗️ Connect to 3rd Party Tools
+sidebar_position: 5
+---
+
+:::warning
+Coming soon... This page is a work in progress.
+:::
diff --git a/apps/docs/docs/integrate/api.md b/apps/docs/docs/integrate/api.md
index a6619dbe2..af3475003 100644
--- a/apps/docs/docs/integrate/api.md
+++ b/apps/docs/docs/integrate/api.md
@@ -1,17 +1,14 @@
---
title: Use the GraphQL API
-sidebar_position: 1
+sidebar_position: 10
---
-:::info
-The OSO API currently only allows read-only GraphQL queries.
-This API should only be used to fetch data to integrate into a live application.
-If you need to perform data science over a large dataset, see the guides on
-[doing data science](./data-science)
-and [downloading static data](./download-data).
-:::
-
-The OSO GraphQL API serves impact metrics for OSS projects, collections, and artifacts. Access to the OSO GraphQL API is necessary for any integration with OSO datasets.
+The OSO API currently only allows read-only GraphQL queries against OSO mart models
+(e.g. impact metrics, project info).
+This API should only be used to fetch data for integration into a live production application.
+For data exploration, check out the guides on
+[performing queries](./query-data.md)
+and [Python notebooks](./python-notebooks.md).
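+
+As a sketch, a read-only query for project info looks roughly like the following.
+The field names here are illustrative; consult the generated schema in the API
+explorer for the exact names.
+
+```graphql
+query ExampleProjects {
+  projects_v1(limit: 10) {
+    project_id
+    project_name
+    display_name
+  }
+}
+```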
## Generate an API key
diff --git a/apps/docs/docs/integrate/embed.md b/apps/docs/docs/integrate/embed.md
deleted file mode 100644
index 1f323ce85..000000000
--- a/apps/docs/docs/integrate/embed.md
+++ /dev/null
@@ -1,11 +0,0 @@
----
-title: 🏗️ Embed OSO Widgets
-sidebar_position: 4
----
-
-:::warning
-This page is a work in progress.
-:::
-
-Excited for this feature? Tell us on the
-[GitHub issue](https://github.com/opensource-observer/oso/issues/623)
diff --git a/apps/docs/docs/integrate/fork-pipeline.md b/apps/docs/docs/integrate/fork-pipeline.md
new file mode 100644
index 000000000..e09bdeb2e
--- /dev/null
+++ b/apps/docs/docs/integrate/fork-pipeline.md
@@ -0,0 +1,8 @@
+---
+title: 🏗️ Fork the Data Pipeline
+sidebar_position: 4
+---
+
+:::warning
+Coming soon... This page is a work in progress.
+:::
diff --git a/apps/docs/docs/integrate/index.md b/apps/docs/docs/integrate/index.md
new file mode 100644
index 000000000..48ddfe9f3
--- /dev/null
+++ b/apps/docs/docs/integrate/index.md
@@ -0,0 +1,16 @@
+---
+title: Get OSO Data
+sidebar_position: 0
+---
+
+Open Source Observer is a fully open data pipeline for measuring the impact of open source efforts.
+That means all source code, data, and infrastructure are publicly available for use.
+
+- [Get Started](../get-started): to set up your Google account for data access and run your first query
+- [Data Overview](./overview): for an overview of all available data
+- [BigQuery Studio Guide](./query-data): to quickly query and download any data
+- [Python notebooks](./python-notebooks): to do more in-depth data science and processing
+- [Fork the data pipeline](./fork-pipeline): to set up your own data pipeline from any OSO model
+- [Connect OSO to 3rd Party tools](./3rd-party): like Hex.tech, Tableau, and Metabase
+- [API access](./api): to integrate OSO metrics into a live production application
+- [oss-directory](./oss-directory): to leverage [oss-directory](https://github.com/opensource-observer/oss-directory) data separately from OSO
diff --git a/apps/docs/docs/integrate/index.mdx b/apps/docs/docs/integrate/index.mdx
deleted file mode 100644
index 435af9e5d..000000000
--- a/apps/docs/docs/integrate/index.mdx
+++ /dev/null
@@ -1,15 +0,0 @@
----
-title: Get OSO Data
-sidebar_position: 0
----
-
-:::info
-There are a number of ways to access OSO data. This doc features some of the most common use cases, which you can explore further via the links on the sidebar.
-:::
-
-
-- If you want to download a snapshot of the data, the easiest way is to download it directly from BigQuery. Check out our guide to [Get Started](../get-started).
-- If you are trying to connect the latest OSO metrics into a live production application, then check out our [GraphQL API](./api).
-- If you want to do data science over any dataset, check out this [guide](./data-science).
-- If you want to just download the project info from OSS directory, we have [libraries and exports](./oss-directory) that you can use.
-
diff --git a/apps/docs/docs/integrate/oss-directory.md b/apps/docs/docs/integrate/oss-directory.md
index f376fe9c1..f07db5842 100644
--- a/apps/docs/docs/integrate/oss-directory.md
+++ b/apps/docs/docs/integrate/oss-directory.md
@@ -1,6 +1,6 @@
---
title: Fetch Project Info
-sidebar_position: 3
+sidebar_position: 12
---
:::info
diff --git a/apps/docs/docs/integrate/overview/ethereum.png b/apps/docs/docs/integrate/overview/ethereum.png
new file mode 100644
index 000000000..b6c33aecd
Binary files /dev/null and b/apps/docs/docs/integrate/overview/ethereum.png differ
diff --git a/apps/docs/docs/integrate/overview/farcaster.jpg b/apps/docs/docs/integrate/overview/farcaster.jpg
new file mode 100644
index 000000000..c07038534
Binary files /dev/null and b/apps/docs/docs/integrate/overview/farcaster.jpg differ
diff --git a/apps/docs/docs/integrate/overview/gitcoin.png b/apps/docs/docs/integrate/overview/gitcoin.png
new file mode 100644
index 000000000..38272ba4d
Binary files /dev/null and b/apps/docs/docs/integrate/overview/gitcoin.png differ
diff --git a/apps/docs/docs/integrate/overview/github.png b/apps/docs/docs/integrate/overview/github.png
new file mode 100644
index 000000000..d9f7224c0
Binary files /dev/null and b/apps/docs/docs/integrate/overview/github.png differ
diff --git a/apps/docs/docs/integrate/overview/index.mdx b/apps/docs/docs/integrate/overview/index.mdx
new file mode 100644
index 000000000..078b8c707
--- /dev/null
+++ b/apps/docs/docs/integrate/overview/index.mdx
@@ -0,0 +1,267 @@
+---
+title: Data Overview
+sidebar_position: 1
+---
+
+import Button from "../../../src/components/plasmic/Button";
+import OsoLogo from "./oso-primary.png";
+import GithubLogo from "./github.png";
+import EthereumLogo from "./ethereum.png";
+import SuperchainLogo from "./superchain.png";
+import FarcasterLogo from "./farcaster.jpg";
+import LensLogo from "./lens-protocol.png";
+import GitcoinLogo from "./gitcoin.png";
+import OpenrankLogo from "./openrank.png";
+
+
+## OSO Data Pipeline
+
+
+
+
+
+Every stage of the OSO data pipeline is queryable and downloadable.
+Like most dbt-based pipelines, we split the pipeline stages into
+[staging, intermediate, and mart models](https://docs.getdbt.com/best-practices/how-we-structure/1-guide-overview).
+
+You can find reference documentation for every data model at
+[https://models.opensource.observer/](https://models.opensource.observer/).
+
+### OSO Mart Models
+
+These are the final products of the data pipeline,
+served from our [API](../api).
+
+For example, you can get a list of
+[oss-directory projects](https://models.opensource.observer/#!/model/model.opensource_observer.projects_v1):
+
+```sql
+select
+ project_id,
+ project_name,
+ display_name,
+ description
+from `opensource-observer.oso.projects_v1` LIMIT 10
+```
+
+or [code metrics by project](https://models.opensource.observer/#!/model/model.opensource_observer.code_metrics_by_project_v1).
+
+```sql
+select *
+from `opensource-observer.oso.code_metrics_by_project_v1`
+where project_name = 'uniswap'
+```
+
+*Note: Unless the model name is versioned, expect that the model is unstable and should not be depended on
+in a live production application.*
+
+
+### OSO Staging / Intermediate Models
+
+From source data, we produce a "universal event table", currently stored at
+[`int_events`](https://models.opensource.observer/#!/model/model.opensource_observer.int_events).
+Each event consists of an [event_type](../../how-oso-works/event)
+(e.g. a git commit or contract invocation),
+[to/from artifacts](../../how-oso-works/oss-directory/artifact),
+a timestamp, and an amount.
+
+From this event table, we aggregate events in downstream models to produce our metrics.
+For example, you may find it cheaper to run queries against
+[`int_events_daily_to_project`](https://models.opensource.observer/#!/model/model.opensource_observer.int_events_daily_to_project).
+
+```sql
+SELECT event_source, SUM(amount) FROM `opensource-observer.oso.int_events_daily_to_project`
+WHERE project_id = 'XSDgPwFuQVcj57ARcKTGrm2w80KKlqJxaBWF6jZqe7w=' AND event_type = 'CONTRACT_INVOCATION_DAILY_COUNT'
+GROUP BY project_id, event_source
+```
+
+### OSO Playground
+
+
+
+We maintain a subset of projects and events in a playground dataset for testing and development.
+All of the production models are mirrored in this environment.
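+
+For example, assuming the playground mirrors the production mart models under an
+`oso_playground` dataset (check the model reference for the exact dataset name),
+the earlier query can be run more cheaply like this:
+
+```sql
+select
+  project_id,
+  project_name
+from `opensource-observer.oso_playground.projects_v1`
+limit 10
+```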
+
+## Source Data
+
+### GitHub Data
+
+
+
+
+
+[Reference documentation](https://models.opensource.observer/#!/source_list/github_archive)
+
+GitHub data is predominantly provided by the incredible
+[GH Archive](https://www.gharchive.org/) project, which
+maintains a BigQuery public dataset that is refreshed every hour.
+
+For example, to count the number of issues opened, closed, and reopened on 2020/01/01:
+
+```sql
+SELECT event as issue_status, COUNT(*) as cnt FROM (
+  SELECT type, repo.name, actor.login,
+    JSON_EXTRACT(payload, '$.action') as event
+  FROM `githubarchive.day.20200101`
+  WHERE type = 'IssuesEvent'
+)
+GROUP BY issue_status;
+```
+
+The underlying GitHub data is governed by the GitHub
+[terms of service](https://docs.github.com/en/site-policy/github-terms/github-terms-of-service).
+GH Archive code and documentation are covered by the
+[MIT license](https://github.com/igrigorik/gharchive.org/blob/master/LICENSE.md).
+
+### Ethereum Data
+
+
+
+
+
+[Reference documentation](https://models.opensource.observer/#!/source_list/ethereum)
+
+The Google Cloud team maintains a public
+[Ethereum dataset](https://cloud.google.com/blog/products/data-analytics/ethereum-bigquery-public-dataset-smart-contract-analytics).
+This is backed by the [ethereum-etl](https://github.com/blockchain-etl/ethereum-etl) project.
+
+For example, to get the 10 most recent transactions:
+
+```sql
+select
+ `hash`,
+ block_number,
+ from_address,
+ to_address,
+ value,
+ gas,
+ gas_price
+from `bigquery-public-data.crypto_ethereum.transactions` as transactions
+order by block_number desc
+limit 10
+```
+
+ethereum-etl code is covered by the
+[MIT license](https://github.com/blockchain-etl/ethereum-etl/blob/develop/LICENSE).
+
+### Superchain Data
+
+
+
+
+
+OSO maintains public datasets for the Superchain,
+backed by our partners at
+[Goldsky](https://goldsky.com/).
+
+We currently have coverage for:
+- [Optimism mainnet](https://models.opensource.observer/#!/source_list/superchain)
+- [Base](https://models.opensource.observer/#!/source_list/base)
+- [Frax](https://models.opensource.observer/#!/source_list/frax)
+- [Metal](https://models.opensource.observer/#!/source_list/metal)
+- [Mode](https://models.opensource.observer/#!/source_list/mode)
+- [PGN](https://models.opensource.observer/#!/source_list/pgn)
+- [Zora](https://models.opensource.observer/#!/source_list/zora)
+
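+A query over this data might look roughly like the following. The dataset and
+table names here are illustrative only; see the reference documentation linked
+above for the actual names.
+
+```sql
+select
+  `hash`,
+  block_number,
+  from_address,
+  to_address
+from `opensource-observer.optimism.transactions`
+order by block_number desc
+limit 10
+```
+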
+### Farcaster Data
+
+
+
+
+
+[Reference documentation](https://models.opensource.observer/#!/source_list/farcaster)
+
+:::warning
+Coming soon...
+:::
+
+### Lens Data
+
+
+
+
+
+[Reference documentation](https://models.opensource.observer/#!/source_list/lens)
+
+:::warning
+Coming soon...
+:::
+
+### Gitcoin Passport Data
+
+
+
+
+
+[Reference documentation](https://models.opensource.observer/#!/source_list/gitcoin)
+
+:::warning
+Coming soon...
+:::
+
+
+### OpenRank Data
+
+
+
+
+
+[Reference documentation](https://models.opensource.observer/#!/source_list/karma3)
+
+:::warning
+Coming soon...
+:::
\ No newline at end of file
diff --git a/apps/docs/docs/integrate/overview/lens-protocol.png b/apps/docs/docs/integrate/overview/lens-protocol.png
new file mode 100644
index 000000000..60152720d
Binary files /dev/null and b/apps/docs/docs/integrate/overview/lens-protocol.png differ
diff --git a/apps/docs/docs/integrate/overview/openrank.png b/apps/docs/docs/integrate/overview/openrank.png
new file mode 100644
index 000000000..f627f95ed
Binary files /dev/null and b/apps/docs/docs/integrate/overview/openrank.png differ
diff --git a/apps/docs/docs/integrate/overview/oso-primary.png b/apps/docs/docs/integrate/overview/oso-primary.png
new file mode 100644
index 000000000..a5c4b2b83
Binary files /dev/null and b/apps/docs/docs/integrate/overview/oso-primary.png differ
diff --git a/apps/docs/docs/integrate/overview/superchain.png b/apps/docs/docs/integrate/overview/superchain.png
new file mode 100644
index 000000000..703ea9045
Binary files /dev/null and b/apps/docs/docs/integrate/overview/superchain.png differ
diff --git a/apps/docs/docs/integrate/data-science.md b/apps/docs/docs/integrate/python-notebooks.md
similarity index 98%
rename from apps/docs/docs/integrate/data-science.md
rename to apps/docs/docs/integrate/python-notebooks.md
index 509e51fc7..9c5e2b5a1 100644
--- a/apps/docs/docs/integrate/data-science.md
+++ b/apps/docs/docs/integrate/python-notebooks.md
@@ -1,15 +1,11 @@
---
-title: Do Data Science
-sidebar_position: 2
+title: Write Python notebooks
+sidebar_position: 3
---
-:::info
Notebooks are a great way for data scientists to explore data, organize ad-hoc analysis, and share insights. We've included several template notebooks to help you get started working with OSO data. You can find these on [Google Colab](https://drive.google.com/drive/folders/1mzqrSToxPaWhsoGOR-UVldIsaX1gqP0F?usp=drive_link) and in the [community directory](https://github.com/opensource-observer/insights/tree/main/community/notebooks) of our insights repo. We encourage you to share your analysis and visualizations with the OSO community.
-:::
-:::warning
-You will need access to the OSO data warehouse to do data science. See our getting started guide [here](../get-started/#login-to-bigquery).
-:::
+You will need access to the OSO data warehouse to do data science. See our getting started guide [here](../get-started).
## Fetching Data
@@ -23,8 +19,6 @@ The next section will walk you through each of these methods.
### With Google Colab
----
-
The fastest way to get started with data science on OSO is to copy one of our notebooks on [Google Colab](https://drive.google.com/drive/folders/1mzqrSToxPaWhsoGOR-UVldIsaX1gqP0F?usp=drive_link).
You can also create a new notebook from scratch and run it in the cloud. Here's how to get started:
@@ -91,7 +85,7 @@ You can also create a new notebook from scratch and run it in the cloud. Here's
You can execute these imports in a new code block after you've grabbed your data or back at the top of your notebook with the other imports.
-That's it! You're ready to start analyzing the OSO dataset in a Google Colab notebook. You can [skip ahead to the tutorial](./data-science#tutorial-github-stars--forks-analysis) to see an example of how to analyze the data.
+That's it! You're ready to start analyzing the OSO dataset in a Google Colab notebook. You can [skip ahead to the tutorial](#tutorial-github-stars--forks-analysis) to see an example of how to analyze the data.
:::tip
You can also download your Colab notebooks to your local machine and run them in Jupyter.
@@ -99,8 +93,6 @@ You can also download your Colab notebooks to your local machine and run them in
### Using Jupyter on your machine
----
-
This section will walk you through setting up a local Jupyter notebook environment, storing your GCP service account key on your machine, and connecting to the OSO data warehouse.
#### Install Anaconda
@@ -226,14 +218,10 @@ You can also go there directly by following [this link](https://console.cloud.go
![GCP APIs](./gcp_apis.png)
----
-
Click the **Create Credentials** button.
![GCP Credentials](./gcp_credentials.png)
----
-
You will be prompted to configure your credentials:
- **Select an API**: BigQuery API
@@ -241,8 +229,6 @@ You will prompted to configure your credentials:
Click **Next**.
----
-
You will be prompted to create a service account:
- **Service account name**: Add whatever name you want (e.g., playground-service-account)
@@ -251,8 +237,6 @@ You will be prompted to create a service account:
Click **Create and continue**.
----
-
You will be prompted to grant your service account access to your project.
- **Select a role**: BigQuery > BigQuery Admin
@@ -261,24 +245,18 @@ You will be prompted to grant your service account access to your project.
Click **Continue**.
----
-
You can skip the final step by clicking **Done**. Or, you may grant additional users access to your service account by adding their emails (this is not required).
You should now see the new service account under the **Credentials** screen.
![GCP Credentials Keys](./gcp_credentials_keys.png)
----
-
Click the pencil icon under **Actions** in the **Service Accounts** table.
Then navigate to the **Keys** tab and click **Add Key** > **Create new key**.
![GCP Add Key](./gcp_add_key.png)
----
-
Choose **JSON** and click **Create**.
It will download the JSON file with your private key info. You should be able to find the file in your downloads folder.
@@ -297,8 +275,6 @@ A Jupyter directory will open in your browser. Navigate to the directory where y
Click **New** > **Python 3** to open a new notebook. (Use your virtual environment if you have one.)
----
-
You should have a blank notebook open.
Import the BigQuery client library and authenticate with your service account key.
@@ -357,12 +333,10 @@ If you prefer to work with static data, you can export your data from BigQuery t
df = pd.read_csv('path/to/your/file.csv')
```
-If this is your preferred workflow, you can [skip the first part](./data-science#transform) of the next section.
+If this is your preferred workflow, you can [skip the first part](#transform) of the next section.
## Running Your Own Analysis
----
-
Once you have your local environment set up, you can fork any of the notebooks in the [community GitHub directory](https://github.com/opensource-observer/insights/tree/main/community/notebooks).
Or you can run them directly in the cloud through our [Community Colab directory](https://drive.google.com/drive/folders/1mzqrSToxPaWhsoGOR-UVldIsaX1gqP0F?usp=drive_link).
@@ -516,8 +490,6 @@ dff.to_csv('code_metrics.csv', index=False)
## Creating Impact Metrics
----
-
An **impact metric** is essentially a SQL query made against the OSO dataset that enables a user to make objective comparisons of impact among projects.
There are a variety of statistical techniques for analyzing data about impact metrics and identifying trends. This section provides a basic example of how to create an impact metric and run a distribution analysis.
@@ -550,8 +522,6 @@ If you'd like to share your impact metric analysis with the OSO community, you c
### Tutorial: analyze fork count distributions
----
-
This example will walk you through the process of normalizing the distribution for the number of forks a project has.
You can find the notebook shown in this tutorial [here](https://github.com/opensource-observer/insights/blob/main/community/notebooks/oso_impact_vector_starter.ipynb).
@@ -607,8 +577,6 @@ df = results.to_dataframe()
df.set_index('project_name', inplace=True)
```
----
-
#### Normalizing the Data
Now we have a dataframe with the latest fork counts for all projects in the OSO data warehouse. Next, we will normalize the fork column through some vector math. We will use the [z-score](https://en.wikipedia.org/wiki/Standard_score) to measure how many standard deviations a project's forks are from the mean, and then normalize the z-scores to a 0-1 scale.
@@ -641,8 +609,6 @@ We can also take a look at the relationship between absolute forks and the norma
Not all datasets will have a log normal distribution. It's important to understand the distribution of the underlying impact metric and experiment with different models before setting performance targets.
:::
----
-
#### Comparing Projects
Now that we have our distribution, we can compare projects to see how they perform relative to others in our collection. We can also set performance targets based on the normalized distribution. For example, an "exceptional" project might be in the top 5% of the distribution and an "excellent" project might be in the top 20%.
@@ -703,8 +669,6 @@ ax.grid(which='major', axis='x', color='black', lw=.5)
## Sharing Analysis
----
-
Once you have completed your analysis, you can share it with the community by submitting a PR to the [insights repo](https://github.com/opensource-observer/insights).
If you have ideas for analysis that you would like to see or receive help on, please [open an issue](https://github.com/opensource-observer/insights/issues) and tag one of the maintainers.
diff --git a/apps/docs/docs/integrate/download-data.md b/apps/docs/docs/integrate/query-data.md
similarity index 82%
rename from apps/docs/docs/integrate/download-data.md
rename to apps/docs/docs/integrate/query-data.md
index 3c18a0516..ff83763a7 100644
--- a/apps/docs/docs/integrate/download-data.md
+++ b/apps/docs/docs/integrate/query-data.md
@@ -1,11 +1,9 @@
---
-title: 🏗️ Download OSO Data
-sidebar_position: 5
+title: 🏗️ Query on BigQuery Studio
+sidebar_position: 2
---
-:::info
As part of our [open source, open data, open infrastructure](../../blog/open-source-open-data-open-infra) initiative, we are making OSO data as widely available as possible. Use this guide to download the latest data for your own data stack.
-:::
:::warning
Coming soon... This page is a work in progress.
diff --git a/apps/docs/package.json b/apps/docs/package.json
index 58c2d7f65..85ffb0552 100644
--- a/apps/docs/package.json
+++ b/apps/docs/package.json
@@ -27,7 +27,7 @@
"@docusaurus/theme-common": "3.1.1",
"@laxels/docusaurus-plugin-segment": "^1.0.6",
"@mdx-js/react": "^3.0.0",
- "@plasmicapp/react-web": "^0.2.337",
+ "@plasmicapp/react-web": "^0.2.340",
"clsx": "^2.1.0",
"prism-react-renderer": "^2.3.1",
"react": "^18.2.0",
diff --git a/apps/docs/plasmic.lock b/apps/docs/plasmic.lock
index b716894a3..2c27553b4 100644
--- a/apps/docs/plasmic.lock
+++ b/apps/docs/plasmic.lock
@@ -14,32 +14,32 @@
{
"type": "renderModule",
"assetId": "z50hW5Ihi9k5",
- "checksum": "b79d99966dd9b5ee37e2e412ff45b750"
+ "checksum": "8041afb9d5261cb9dcfdde02df90afa5"
},
{
"type": "cssRules",
"assetId": "z50hW5Ihi9k5",
- "checksum": "b79d99966dd9b5ee37e2e412ff45b750"
+ "checksum": "8041afb9d5261cb9dcfdde02df90afa5"
},
{
"type": "renderModule",
"assetId": "8u0yNVg3vXsq",
- "checksum": "5a6fde6a692a357d6ec6e9d2974a8e1f"
+ "checksum": "2d14ebf9cc6b71d89f1731028117ffc5"
},
{
"type": "cssRules",
"assetId": "8u0yNVg3vXsq",
- "checksum": "5a6fde6a692a357d6ec6e9d2974a8e1f"
+ "checksum": "2d14ebf9cc6b71d89f1731028117ffc5"
},
{
"type": "renderModule",
"assetId": "XdMR0R8lmSdz",
- "checksum": "b04a9ba1b190d268180bc7de1f67d538"
+ "checksum": "840794948d2228235831cd844aebde8f"
},
{
"type": "cssRules",
"assetId": "XdMR0R8lmSdz",
- "checksum": "b04a9ba1b190d268180bc7de1f67d538"
+ "checksum": "840794948d2228235831cd844aebde8f"
},
{
"type": "icon",
diff --git a/pnpm-lock.yaml b/pnpm-lock.yaml
index 19468f903..4e21c4e4e 100644
--- a/pnpm-lock.yaml
+++ b/pnpm-lock.yaml
@@ -42,8 +42,8 @@ importers:
specifier: ^3.0.0
version: 3.0.0(@types/react@18.3.3)(react@18.2.0)
'@plasmicapp/react-web':
- specifier: ^0.2.337
- version: 0.2.337(@types/react@18.3.3)(react-dom@18.2.0(react@18.2.0))(react@18.2.0)
+ specifier: ^0.2.340
+ version: 0.2.340(@types/react@18.3.3)(react-dom@18.2.0(react@18.2.0))(react@18.2.0)
clsx:
specifier: ^2.1.0
version: 2.1.0
@@ -1423,8 +1423,8 @@ packages:
resolution: {integrity: sha512-Chk32uHMg6TnQdvw2e9IlqPpFX/6NLuK0Ys2PqLb7/gL5uFn9mXvK715FGLlOLQrcO4qIkNHkvPGktzzXexsFw==}
engines: {node: '>=6.9.0'}
- '@babel/runtime@7.24.6':
- resolution: {integrity: sha512-Ja18XcETdEl5mzzACGd+DKgaGJzPTCow7EglgwTmHdwokzDFYh/MHua6lU6DV/hjF2IaOJ4oX2nqnjG7RElKOw==}
+ '@babel/runtime@7.24.7':
+ resolution: {integrity: sha512-UwgBRMjJP+xv857DCngvqXI3Iq6J4v0wXmwc6sapg+zyhbwmQX67LUEFrkK5tbyJ30jGuG3ZvWpBiB9LCy1kWw==}
engines: {node: '>=6.9.0'}
'@babel/template@7.22.15':
@@ -2376,8 +2376,8 @@ packages:
'@types/react':
optional: true
- '@mui/private-theming@5.15.14':
- resolution: {integrity: sha512-UH0EiZckOWcxiXLX3Jbb0K7rC8mxTr9L9l6QhOZxYc4r8FHUkefltV9VDGLrzCaWh30SQiJvAEd7djX3XXY6Xw==}
+ '@mui/private-theming@5.15.20':
+ resolution: {integrity: sha512-BK8F94AIqSrnaPYXf2KAOjGZJgWfvqAVQ2gVR3EryvQFtuBnG6RwodxrCvd3B48VuMy6Wsk897+lQMUxJyk+6g==}
engines: {node: '>=12.0.0'}
peerDependencies:
'@types/react': ^17.0.0 || ^18.0.0
@@ -2470,8 +2470,8 @@ packages:
'@types/react':
optional: true
- '@mui/utils@5.15.14':
- resolution: {integrity: sha512-0lF/7Hh/ezDv5X7Pry6enMsbYyGKjADzvHyo3Qrc/SSlTsQ1VkbDMbH0m2t3OR5iIVLwMoxwM7yGd+6FCMtTFA==}
+ '@mui/utils@5.15.20':
+ resolution: {integrity: sha512-mAbYx0sovrnpAu1zHc3MDIhPqL8RPVC5W5xcO1b7PiSCJPtckIZmBkp8hefamAvUiAV8gpfMOM6Zb+eSisbI2A==}
engines: {node: '>=12.0.0'}
peerDependencies:
'@types/react': ^17.0.0 || ^18.0.0
@@ -2903,8 +2903,8 @@ packages:
peerDependencies:
react: '>=16.8.0'
- '@plasmicapp/data-sources@0.1.155':
- resolution: {integrity: sha512-jfJlpHfJyurBBQT9Iun6Sq8pP4nxHpEHUPAINbOIuRKJGDJMv4vWC1sOGCvUtFjv9ltg5H1ts6EzLbSmTGhH+Q==}
+ '@plasmicapp/data-sources@0.1.156':
+ resolution: {integrity: sha512-jb1lMRNyL/zwgBhY46PfDKiWJW9jFTSy1n8qowGEhPooYto2jf/fGkm+csWJV5pvdjGIbAz/nQKsfw9GCbgm4g==}
engines: {node: '>=10'}
peerDependencies:
react: '>=16.8.0'
@@ -2915,8 +2915,8 @@ packages:
react: '>=16.8.0'
react-dom: '>=16.8.0'
- '@plasmicapp/host@1.0.196':
- resolution: {integrity: sha512-sa4lqgqDwPwr8/UTex6DC6/7KWQwi2DoCn3oOoobAIN3lpojY9Vkl1JUahK6nxU7sSgp7I0uc2a1PaQLgFTG+Q==}
+ '@plasmicapp/host@1.0.197':
+ resolution: {integrity: sha512-aotntQbSTccCLKZqKb3k2/1JEtVvuG+CS0JIqRIibcKEdgcCAS82lvemBATSY0RZWK6r/pTMWMsabQa0TxSyQw==}
peerDependencies:
react: '>=16.8.0'
react-dom: '>=16.8.0'
@@ -2955,8 +2955,8 @@ packages:
resolution: {integrity: sha512-qk6i4YrQuiAVe88pygNZ3xcFdDqaZugkZq2eDlyd4sFIyBkhGpszPk6BRtj/2OgA5ceXEa6MBat7r8Ql1oVnTA==}
engines: {node: '>=10'}
- '@plasmicapp/loader-splits@1.0.60':
- resolution: {integrity: sha512-u2RK0NOCd6+vi63W4wl8ZCxeYlSbmEj5b0IiBlGoFRwWB7/dpko9gTvbxlCOcIeIFUd7J14PzOE3aQCoALur7Q==}
+ '@plasmicapp/loader-splits@1.0.62':
+ resolution: {integrity: sha512-+kSzevnMPc3/eEYDLyQSG6+c4iXd1nVZuMJQ43Wg5uoWIMzZb15CyQBZYWiBbR3Wb3h0bHiy+L3BMG3n7KGR5g==}
engines: {node: '>=10'}
'@plasmicapp/nextjs-app-router@1.0.11':
@@ -3003,8 +3003,8 @@ packages:
peerDependencies:
react: ^16.8.0 || ^17.0.0 || ^18.0.0
- '@plasmicapp/react-web@0.2.337':
- resolution: {integrity: sha512-u2HENdYxLgHxt9+CzGpvdCttM5khdBQUiwkEsmfvgnsf9RfVhSWrS7FvdgN5roweQMfrt/vFwljM/JaocnxPZQ==}
+ '@plasmicapp/react-web@0.2.340':
+ resolution: {integrity: sha512-E/N8obSU63iqwoyq0GLB42ZRmJ1Kh/DmuHTphfM2N+QbR3SrmYKD9IWakpR5+JMA7UsbmFKl0ZyTXsb+3fv+eA==}
peerDependencies:
react: '>=16.8.0'
react-dom: '>=16.8.0'
@@ -12995,7 +12995,7 @@ snapshots:
dependencies:
regenerator-runtime: 0.14.1
- '@babel/runtime@7.24.6':
+ '@babel/runtime@7.24.7':
dependencies:
regenerator-runtime: 0.14.1
@@ -13579,7 +13579,7 @@ snapshots:
'@docusaurus/react-loadable@5.5.2(react@18.2.0)':
dependencies:
- '@types/react': 18.3.3
+ '@types/react': 18.2.64
prop-types: 15.8.1
react: 18.2.0
@@ -15123,10 +15123,10 @@ snapshots:
'@emotion/styled': 11.11.0(@emotion/react@11.11.3(@types/react@18.2.48)(react@18.2.0))(@types/react@18.2.48)(react@18.2.0)
'@types/react': 18.2.48
- '@mui/private-theming@5.15.14(@types/react@18.2.48)(react@18.2.0)':
+ '@mui/private-theming@5.15.20(@types/react@18.2.48)(react@18.2.0)':
dependencies:
- '@babel/runtime': 7.24.6
- '@mui/utils': 5.15.14(@types/react@18.2.48)(react@18.2.0)
+ '@babel/runtime': 7.24.7
+ '@mui/utils': 5.15.20(@types/react@18.2.48)(react@18.2.0)
prop-types: 15.8.1
react: 18.2.0
optionalDependencies:
@@ -15143,7 +15143,7 @@ snapshots:
'@mui/styled-engine@5.15.14(@emotion/react@11.11.3(@types/react@18.2.48)(react@18.2.0))(@emotion/styled@11.11.0(@emotion/react@11.11.3(@types/react@18.2.48)(react@18.2.0))(@types/react@18.2.48)(react@18.2.0))(react@18.2.0)':
dependencies:
- '@babel/runtime': 7.24.6
+ '@babel/runtime': 7.24.7
'@emotion/cache': 11.11.0
csstype: 3.1.3
prop-types: 15.8.1
@@ -15165,11 +15165,11 @@ snapshots:
'@mui/system@5.15.15(@emotion/react@11.11.3(@types/react@18.2.48)(react@18.2.0))(@emotion/styled@11.11.0(@emotion/react@11.11.3(@types/react@18.2.48)(react@18.2.0))(@types/react@18.2.48)(react@18.2.0))(@types/react@18.2.48)(react@18.2.0)':
dependencies:
- '@babel/runtime': 7.24.6
- '@mui/private-theming': 5.15.14(@types/react@18.2.48)(react@18.2.0)
+ '@babel/runtime': 7.24.7
+ '@mui/private-theming': 5.15.20(@types/react@18.2.48)(react@18.2.0)
'@mui/styled-engine': 5.15.14(@emotion/react@11.11.3(@types/react@18.2.48)(react@18.2.0))(@emotion/styled@11.11.0(@emotion/react@11.11.3(@types/react@18.2.48)(react@18.2.0))(@types/react@18.2.48)(react@18.2.0))(react@18.2.0)
'@mui/types': 7.2.14(@types/react@18.2.48)
- '@mui/utils': 5.15.14(@types/react@18.2.48)(react@18.2.0)
+ '@mui/utils': 5.15.20(@types/react@18.2.48)(react@18.2.0)
clsx: 2.1.1
csstype: 3.1.3
prop-types: 15.8.1
@@ -15203,9 +15203,9 @@ snapshots:
optionalDependencies:
'@types/react': 18.2.48
- '@mui/utils@5.15.14(@types/react@18.2.48)(react@18.2.0)':
+ '@mui/utils@5.15.20(@types/react@18.2.48)(react@18.2.0)':
dependencies:
- '@babel/runtime': 7.24.6
+ '@babel/runtime': 7.24.7
'@types/prop-types': 15.7.12
prop-types: 15.8.1
react: 18.2.0
@@ -15710,10 +15710,10 @@ snapshots:
dependencies:
react: 18.2.0
- '@plasmicapp/data-sources@0.1.155(react-dom@18.2.0(react@18.2.0))(react@18.2.0)':
+ '@plasmicapp/data-sources@0.1.156(react-dom@18.2.0(react@18.2.0))(react@18.2.0)':
dependencies:
'@plasmicapp/data-sources-context': 0.1.21(react@18.2.0)
- '@plasmicapp/host': 1.0.196(react-dom@18.2.0(react@18.2.0))(react@18.2.0)
+ '@plasmicapp/host': 1.0.197(react-dom@18.2.0(react@18.2.0))(react@18.2.0)
'@plasmicapp/isomorphic-unfetch': 1.0.3
'@plasmicapp/query': 0.1.79(react@18.2.0)
fast-stringify: 2.0.0
@@ -15729,7 +15729,7 @@ snapshots:
react-dom: 18.2.0(react@18.2.0)
window-or-global: 1.0.1
- '@plasmicapp/host@1.0.196(react-dom@18.2.0(react@18.2.0))(react@18.2.0)':
+ '@plasmicapp/host@1.0.197(react-dom@18.2.0(react@18.2.0))(react@18.2.0)':
dependencies:
'@plasmicapp/query': 0.1.79(react@18.2.0)
csstype: 3.1.3
@@ -15787,7 +15787,7 @@ snapshots:
dependencies:
json-logic-js: 2.0.2
- '@plasmicapp/loader-splits@1.0.60':
+ '@plasmicapp/loader-splits@1.0.62':
dependencies:
json-logic-js: 2.0.2
@@ -15834,13 +15834,13 @@ snapshots:
dependencies:
react: 18.2.0
- '@plasmicapp/react-web@0.2.337(@types/react@18.3.3)(react-dom@18.2.0(react@18.2.0))(react@18.2.0)':
+ '@plasmicapp/react-web@0.2.340(@types/react@18.3.3)(react-dom@18.2.0(react@18.2.0))(react@18.2.0)':
dependencies:
'@plasmicapp/auth-react': 0.0.21(react@18.2.0)
- '@plasmicapp/data-sources': 0.1.155(react-dom@18.2.0(react@18.2.0))(react@18.2.0)
+ '@plasmicapp/data-sources': 0.1.156(react-dom@18.2.0(react@18.2.0))(react@18.2.0)
'@plasmicapp/data-sources-context': 0.1.21(react@18.2.0)
- '@plasmicapp/host': 1.0.196(react-dom@18.2.0(react@18.2.0))(react@18.2.0)
- '@plasmicapp/loader-splits': 1.0.60
+ '@plasmicapp/host': 1.0.197(react-dom@18.2.0(react@18.2.0))(react@18.2.0)
+ '@plasmicapp/loader-splits': 1.0.62
'@plasmicapp/nextjs-app-router': 1.0.11(react-dom@18.2.0(react@18.2.0))(react@18.2.0)
'@plasmicapp/prepass': 1.0.17(react-dom@18.2.0(react@18.2.0))(react@18.2.0)
'@plasmicapp/query': 0.1.79(react@18.2.0)
@@ -17562,7 +17562,7 @@ snapshots:
'@types/react-router@5.1.20':
dependencies:
'@types/history': 4.7.11
- '@types/react': 18.3.3
+ '@types/react': 18.2.64
'@types/react-transition-group@4.4.10':
dependencies: