LCA-on-the-Line: Benchmarking Out-of-Distribution Generalization with Class Taxonomies (ICML 2024 Oral Presentation)

[Motivation figure]

Quick Start

```shell
# Create a virtual environment
conda create --name lca python=3.9
conda activate lca
# Install dependencies
pip install -r requirements.txt
```

Experiments

Correlation Experiment

1). Pre-extract model logits using the extraction scripts provided in the repository.

Alternatively, download pre-extracted logits from this link.
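The extraction step amounts to running each evaluated model over the dataset once and caching its raw class logits to disk. A minimal framework-agnostic sketch (the `model_fn` callable and `.npy` output path are illustrative assumptions; the repository's scripts handle model loading and dataset specifics):

```python
import numpy as np

def extract_logits(model_fn, batches, out_path=None):
    """Run model_fn over an iterable of input batches and stack the
    per-batch class logits into one (num_samples, num_classes) array."""
    logits = np.concatenate([np.asarray(model_fn(x)) for x in batches], axis=0)
    if out_path is not None:
        np.save(out_path, logits)  # cache for the correlation runs below
    return logits
```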

2). Modify the logit_folder_path in main.py to the path where the logits are stored.

3). Launch the experiment:

```shell
python main.py
```
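Conceptually, the correlation experiment scores each model by its mean in-distribution LCA distance (the taxonomy distance between each prediction and the ground-truth class) and checks how well that ranking tracks OOD accuracy across models. A minimal sketch, assuming a precomputed pairwise LCA-distance matrix over the classes (the function and variable names here are illustrative, not the repository's API):

```python
import numpy as np

def mean_lca_distance(preds, labels, lca_matrix):
    """Average taxonomy distance between predicted and true classes.
    lca_matrix[i, j] holds the tree distance between classes i and j
    (zero on the diagonal)."""
    return float(lca_matrix[preds, labels].mean())

def spearman(x, y):
    """Spearman rank correlation via Pearson correlation of ranks
    (assumes no ties, which suffices for model-level scores)."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx @ ry) / np.sqrt((rx @ rx) * (ry @ ry)))
```

A lower mean LCA distance (mistakes stay close to the true class in the taxonomy) is the signal expected to correlate with stronger OOD accuracy.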

Soft Labels Linear Probing Experiment

1). Pre-extract backbone model features using extract_feature_linear_probe.py. Alternatively, download pre-extracted backbone features from this link.

2). Point DATASET_PATHS in linear_probe_runner.py to the output path from the previous step.

3). Launch the experiment:

```shell
python linear_probe_runner.py
```
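The idea behind the soft labels is to spread probability mass toward classes that sit close to the ground truth in the taxonomy, then train a linear probe on frozen backbone features against those targets. A minimal NumPy sketch of both pieces (the `temperature` parameter and the exact softening rule are illustrative assumptions; see linear_probe_runner.py for the actual setup):

```python
import numpy as np

def lca_soft_labels(labels, lca_matrix, temperature=1.0):
    """Soft targets: classes nearer the true class in the taxonomy
    (smaller LCA distance) receive higher probability."""
    z = -lca_matrix[labels] / temperature
    z -= z.max(axis=1, keepdims=True)
    p = np.exp(z)
    return p / p.sum(axis=1, keepdims=True)

def probe_step(W, b, X, targets, lr=0.1):
    """One gradient step of a linear probe trained with cross-entropy
    against soft targets (softmax over X @ W + b)."""
    z = X @ W + b
    z -= z.max(axis=1, keepdims=True)
    p = np.exp(z)
    p /= p.sum(axis=1, keepdims=True)
    g = (p - targets) / len(X)  # gradient of mean soft cross-entropy
    return W - lr * (X.T @ g), b - lr * g.sum(axis=0)
```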

Latent hierarchy construction

Follow the instructions in create_hierarchy.py (Models' logits are required).

To use the latent hierarchy in previous experiments, follow the use_latent_hierarchy section in main.py, and update tree_list in linear_probe_runner.py.

To evaluate the correlation between ID LCA and Soft Labels quality, refer to the "Predict soft label quality with source model LCA that construct latent hierarchy" section in main.py.
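At a high level, building a latent hierarchy from a source model's logits means deriving a prototype per class and merging the closest classes bottom-up into a tree. A hedged pure-NumPy sketch (single-linkage merging and cosine distance are illustrative choices, not necessarily the procedure in create_hierarchy.py):

```python
import numpy as np

def class_distances(logits, labels, num_classes):
    """Cosine distance between class prototypes, where a prototype is
    the mean logit vector over samples of that ground-truth class."""
    protos = np.stack([logits[labels == c].mean(axis=0)
                       for c in range(num_classes)])
    protos /= np.linalg.norm(protos, axis=1, keepdims=True)
    return 1.0 - protos @ protos.T

def build_hierarchy(dist):
    """Single-linkage agglomeration: repeatedly merge the two closest
    clusters; the merge sequence encodes the latent tree."""
    clusters = {i: [i] for i in range(dist.shape[0])}
    merges, next_id = [], dist.shape[0]
    while len(clusters) > 1:
        ids = sorted(clusters)
        best = None
        for a in range(len(ids)):
            for b in range(a + 1, len(ids)):
                d = min(dist[i, j] for i in clusters[ids[a]]
                        for j in clusters[ids[b]])
                if best is None or d < best[0]:
                    best = (d, ids[a], ids[b])
        _, ca, cb = best
        merges.append((ca, cb))
        clusters[next_id] = clusters.pop(ca) + clusters.pop(cb)
        next_id += 1
    return merges
```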

Thanks!

The scripts hier.py, datasets/, and wordNet_tree.npy are adapted from Jack Valmadre's hierarchical classification repository.

  • Valmadre, Jack. "Hierarchical classification at multiple operating points." Advances in Neural Information Processing Systems 35 (2022): 18034-18045.

To-Do

  • Clean up code for readability.
  • Make variables/paths configurable as flags.

Contact

[email protected]
