
Fundamentals metadata #596

Open · wants to merge 19 commits into master
Conversation


@RJTK commented Feb 25, 2021

Hello, I created a method to query the fundamentals/metadata endpoint. I'm not well versed in some of the development tools, so some things are still missing: I have not yet been able to get tox and pyenv to run with Python versions below 3.5, and the formatting check also fails.

Can I suggest that support for Python 2.6 and 2.7 be dropped? It seems like a lot of extra hassle to test against these ancient versions.

Perhaps this method should also just replace or be combined with get_ticker_metadata? However, they are two separate endpoints.

Summary of Changes

  • What is the motivation / context for this change
    • There is a bunch of metadata available at the URL tiingo/fundamentals/metadata; the API should have a method to query it.
  • Is this a bugfix or a feature?
    • It is a feature.
  • What does this change / fix? (Link to Github Issue page if applicable)
    • I added the method get_fundamentals_meta (see the sketch after this list).
    • I cleaned up a bunch of repeated code by adding _format_response.
    • I cache the list of available tickers in _ticker_list so that the list_tickers functions don't download the zip file multiple times.
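
For context, a minimal sketch of what the new method could look like is below. The _request helper and the exact signature of _format_response are assumptions for illustration, not the PR's actual code:

    def get_fundamentals_meta(self, tickers=None, fmt='json'):
        """Query the tiingo/fundamentals/metadata endpoint for
        fundamentals metadata, optionally filtered to specific tickers."""
        params = {'format': fmt}
        if tickers:
            # Accept either a list of symbols or a comma-separated string.
            if not isinstance(tickers, str):
                tickers = ','.join(tickers)
            params['tickers'] = tickers
        response = self._request('GET', 'tiingo/fundamentals/metadata',
                                 params=params)
        return self._format_response(response, fmt)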

Checklist

  • Code follows the style guidelines of this project
    • I'm not sure what the style guidelines are. I usually format with Black, but tried to follow some of the patterns that were apparent.
  • Added tests for new functionality
  • Update HISTORY.rst with an entry summarizing your change
  • Tag a project maintainer


lgtm-com bot commented Feb 25, 2021

This pull request introduces 1 alert when merging e5fb0ea into 4cfca69 - view on LGTM.com

new alerts:

  • 1 for Unused local variable

@hydrosquall (Owner)

Hey @RJTK ,

Thanks so much for this contribution. I think you're right that 2.6 and 2.7 are causing issues and it's not worth the maintenance burden given that 2 was sunset at the end of last year.

I also use black on new projects, but started this project before I knew it existed. Once we are fully into Python 3, I think it would be good to do a one-time pass over the whole codebase with black and then add it to CI.

I think this will also let us simplify the parts of the code that check for whether users are running python 2 or 3.

Let me open up an issue to sunset Python 2, with some of the checklist items that are needed for us to switch - I'd be happy to review any PRs you have time for to help with that effort, otherwise I'll try to look into this more on the weekend.

@hydrosquall (Owner)

Opened #601 with a shortlist of things needed to phase out python 2.


RJTK commented Mar 2, 2021

Cool thanks -- I will look at #601 and see if I can address any of the points. I am not well versed in these integration tools, but would probably be well advised to learn.

@hydrosquall (Owner)

@RJTK I did some cleanup this weekend, can you try rebasing your branch against the latest master? The CI should be better than before now, and we've dropped support below python 3.6.

Perhaps this method should also just replace or be combined with get_ticker_metadata? However, they are two separate endpoints.

My concern is that the two endpoints may not have identical signatures, and we could end up with parameters in the function that are only meant to be used by one or the other. For the moment, I think it's OK for us to keep the pattern of roughly one endpoint per function for these metadata calls to keep things interpretable, even though get_ticker_price does not really follow this rule (since it uses a different endpoint for intraday data).

I cleaned up a bunch of repeated code by adding _format_response

Brilliant, thank you!

I cache the list of available tickers in _ticker_list so that all the list_tickers functions don't collect the zip file multiple times.

Ah, thanks for looking into this optimization. What are your thoughts on what a user should do if they want to invalidate the cached results?


RJTK commented Mar 8, 2021

roughly 1 endpoint per function

This makes sense to me.

Ah, thanks for looking into this optimization. What are your thoughts on what a user should do if they want to invalidate the cached results?

This didn't occur to me -- I added an option to force it to refresh the cache and updated some comments to make it clear that the results were cached. We could also just get rid of this caching -- my assumption was that it wouldn't be asking the website for the csv file each time the method was called, and that it was basically just a "getter" on a class attribute -- but I realize now that the opposite assumption may also be natural.
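
For reference, a minimal sketch of the refresh option, assuming a _ticker_list attribute as in the PR description and a hypothetical _download_ticker_zip helper (illustrative only, not the PR's actual code):

    def list_tickers(self, asset_types=None, refresh=False):
        """Return the supported tickers, reusing a cached copy of the
        supported_tickers zip unless refresh=True forces a re-download."""
        if refresh or self._ticker_list is None:
            # Hypothetical helper that downloads and parses supported_tickers.zip
            self._ticker_list = self._download_ticker_zip()
        tickers = self._ticker_list
        if asset_types:
            tickers = [t for t in tickers if t.get('assetType') in asset_types]
        return tickers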

I think some tests are failing on the CI because there's no API key? How do I update the fixtures without committing my API key to git? Just set the key to 00000000... ?

@hydrosquall (Owner)

Hi @RJTK, we use a project called vcr.py (see #32) so you can record the JSON response locally, and push that response into the tests/fixtures directory.

Once you have your fixture, you can use the API key cleaning tool introduced in #107 to replace your API key in the response with the 0000... dummy API key.

The discoverability of our docs/test-writing structure can be improved, but for now everything you need should be in the tips section: https://github.com/hydrosquall/tiingo-python/blob/master/CONTRIBUTING.rst#tips
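
As an illustration, a cassette-backed test for the new endpoint could look roughly like this (the fixture filename and assertion are placeholders; the first run records the live response and needs a real key, which should then be scrubbed per #107):

    import vcr
    from tiingo import TiingoClient

    @vcr.use_cassette('tests/fixtures/fundamentals_meta.yaml')
    def test_get_fundamentals_meta():
        """Replays the recorded JSON response, so no network access
        or real API key is needed once the cassette exists."""
        client = TiingoClient({'api_key': 'RECORDED_THEN_SCRUBBED_KEY'})
        meta = client.get_fundamentals_meta()
        assert len(meta) > 0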

We could also just get rid of this caching -- my assumption was that it wouldn't be asking the website for the csv file each time the method was called, and that it was basically just a "getter" on a class attribute -- but I realize now that the opposite assumption may also be natural.

We've erred on the side of not caching so as to avoid returning stale results. If we do have caching as an option, I'd encourage us towards having it be disabled by default so that anyone who currently expects to get fresh results on every call won't have to edit their programs to use the new parameter (this ensures backwards compatibility).

Let me know if I can help with anything else. Thanks for following up on this!


codecov bot commented Mar 14, 2021

Codecov Report

Merging #596 (b34e705) into master (ff845a2) will decrease coverage by 10.99%.
The diff coverage is 89.09%.

Impacted file tree graph

@@             Coverage Diff             @@
##           master     #596       +/-   ##
===========================================
- Coverage   93.72%   82.73%   -11.00%     
===========================================
  Files           5        5               
  Lines         255      249        -6     
===========================================
- Hits          239      206       -33     
- Misses         16       43       +27     
Impacted Files    Coverage Δ
tiingo/api.py     80.37% <89.09%> (-12.81%) ⬇️

Continue to review full report at Codecov.

Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update ff845a2...b34e705. Read the comment docs.


RJTK commented Mar 14, 2021

I removed caching for the ticker symbols and committed the test fixtures.

@hydrosquall (Owner) left a comment


Hi @RJTK, this is looking good. Thanks for looking into VCR.py; the CI is nice and 🟢 now. Now that the checks are passing, I was able to take a closer look at the actual implementation. Let me know if you have any questions.

As an aside, the good work you did consolidating the response formatters will make it easier for us to centralize friendlier error messages when the response turns out to be something besides the intended JSON, so thank you for having the foresight to improve that! #159
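
To make that point concrete, here is a hedged sketch of how a consolidated formatter could surface a friendlier error when the API returns something other than JSON (the function name matches the PR's helper, but the body and the APIResponseError type are assumptions, not the actual implementation):

    class APIResponseError(Exception):
        """Hypothetical error type for unexpected (non-JSON) API responses."""

    def _format_response(response, fmt='json'):
        if fmt == 'json':
            try:
                return response.json()
            except ValueError:
                # Single place to produce a clearer message, e.g. when the API
                # returns an HTML error page or a plain-text rate-limit notice.
                raise APIResponseError(
                    "Expected JSON but got {!r} (HTTP {}): {}".format(
                        response.headers.get('Content-Type'),
                        response.status_code,
                        response.text[:200]))
        return response.content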

[Inline review comments on tiingo/api.py, marked resolved.]
@hydrosquall (Owner)

Hi @RJTK , just wanted to check on the status of your PR / if you had any questions.


RJTK commented Apr 10, 2021

Hi yes, sorry -- I actually just completely forgot about this since I had what I needed working and moved on to another aspect of the project. Thanks a lot for the feedback -- I will update the PR by tomorrow.


hydrosquall commented Apr 11, 2021 via email


RJTK commented Apr 28, 2021

What does it mean to run a workflow? From the docs it says that "Forks of public repositories can submit pull requests that propose changes to a repository's GitHub Actions workflows" -- have I modified a file that affects something here?

I changed the test tickers to PG and CAT -- these are part of the DOW 30 and seem to be more freely available, so users who don't yet have premium keys will hopefully have fewer problems.

@hydrosquall (Owner)

What does it mean to run a workflow? From the docs it says that "Forks of public repositories can submit pull requests that propose changes to a repository's GitHub Actions workflows" -- have I modified a file that affects something here?

I think there was a recent change to how github actions work due to issues with cryptocurrency mining abuse. I've clicked 'approve', so you shouldn't have any issues with this.

https://github.blog/2021-04-22-github-actions-update-helping-maintainers-combat-bad-actors/

For future reference, running a workflow just means running the commands in https://github.com/hydrosquall/tiingo-python/blob/master/.github/workflows/python-package.yml, which happens based on the events specified in these lines:

on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]

I changed the test tickers to PG and CAT

Ah, thank you! FWIW, users generally shouldn't need a Tiingo API account (or even an internet connection) to run the unit tests; this is the purpose of recording the API responses as fixtures. With that said, I don't have a particular attachment to GOOG or APPL, so switching to these tickers is fine with me.

From the looks of it, there might be some additional tests to update

https://github.com/hydrosquall/tiingo-python/pull/596/checks?check_run_id=2487772634

The intraday pricing cassette that was re-recorded also needs to have its ticker updated from GOOG since it's used by the pandas test suite too.

@vcr.use_cassette('tests/fixtures/intraday_price.yaml')
def test_intraday_ticker_price(self):
    """Test the EOD Prices Endpoint with data param"""
    prices = self._client.get_dataframe("GOOGL",
                                        startDate="2018-01-02",
                                        endDate="2018-01-02",
                                        frequency="30Min")

@hydrosquall (Owner)

Hi @RJTK - just doing some project maintenance and wanted to see if you might have the chance to look into the current test issues. My guess is that since you changed the test tickers, the old cassette fixtures will need to be regenerated.
