For a new repository, the crawling process works fine so far.
For an already existing repository we need to run a slightly different process.
The following assumptions are made regarding the versioning and location of releases.
A list of releases could be found in:
a) GitHub releases
b) npm releases
c) ask the user to select a list of tags/releases
For now, we start with c) and assume that a tag is the same as a release.
Preconditions:
We can only crawl a branch of the target repo, not a PR.
TODOS:
Fetch all tags: `execSync('git tag').toString().trim().split('\n')` (see the first sketch after this list)
Show all tags as a multi-choice list ordered by date
User selection step
Crawl all selected branches (see the crawl sketch below)
Detect already crawled deprecations from older branches by their ruid (the same ruid means they were already crawled earlier but not persisted)
💡 Optionally, crawl only the diff between 2 tags so we don't re-crawl already crawled deprecations (see the diff sketch below)
Persist the collected deprecations to the repo (master); see the persist sketch below
Persist the git tag and data to the deprecation object itself
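
A minimal sketch of the first three steps, assuming a Node.js setup; the `--sort=creatordate` flag of `git tag` handles the date ordering, and `inquirer` (an assumption, any prompt library would do) provides the multi-choice step:

```ts
import { execSync } from 'child_process';
import inquirer from 'inquirer';

// Fetch all tags, oldest first; --sort=creatordate orders them by creation date.
function fetchTags(): string[] {
  return execSync('git tag --sort=creatordate').toString().trim().split('\n');
}

// Show all tags as a multi-choice list and return the user's selection.
async function selectTags(): Promise<string[]> {
  const { tags } = await inquirer.prompt([
    {
      type: 'checkbox',
      name: 'tags',
      message: 'Select the tags/releases to crawl:',
      choices: fetchTags(),
    },
  ]);
  return tags;
}
```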
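
A sketch of the crawl and ruid deduplication steps; `crawlDeprecations` is a hypothetical stand-in for the existing crawler, and deprecations are assumed to carry a `ruid` field:

```ts
import { execSync } from 'child_process';

interface Deprecation {
  ruid: string;
  version?: string; // git tag the deprecation was first seen in
  // ...whatever else the crawler collects
}

// Hypothetical stand-in for the existing crawler, run against the working tree.
declare function crawlDeprecations(): Deprecation[];

function crawlTags(tags: string[]): Deprecation[] {
  const seen = new Set<string>(); // ruids already crawled in older tags
  const collected: Deprecation[] = [];

  for (const tag of tags) {
    execSync(`git checkout ${tag}`); // detached HEAD on the tag
    for (const d of crawlDeprecations()) {
      // Same ruid: already crawled in an older tag but not persisted, so skip it.
      if (seen.has(d.ruid)) continue;
      seen.add(d.ruid);
      // Persist the git tag on the deprecation object itself.
      collected.push({ ...d, version: tag });
    }
  }
  return collected;
}
```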
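
For the 💡 diff idea, one option (again just a sketch) is to restrict the crawler to the files that changed between two tags:

```ts
import { execSync } from 'child_process';

// Files changed between two tags; restricting the crawler to these files
// avoids re-crawling deprecations that cannot have changed in between.
function changedFiles(fromTag: string, toTag: string): string[] {
  return execSync(`git diff --name-only ${fromTag} ${toTag}`)
    .toString()
    .trim()
    .split('\n')
    .filter(Boolean);
}
```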
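
Finally, persisting the result back to master could look like this (the file name and commit message are assumptions):

```ts
import { execSync } from 'child_process';
import { writeFileSync } from 'fs';

// Write the collected deprecations back on master and commit them.
function persist(deprecations: object[]): void {
  execSync('git checkout master');
  writeFileSync('deprecations.json', JSON.stringify(deprecations, null, 2));
  execSync('git add deprecations.json');
  execSync('git commit -m "chore: persist crawled deprecations"');
}
```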