LSH Implementation with TFIDF Dense Matrix #101
Comments
Yes, FALCONN supports dense data. In fact, the support for dense data is better than for sparse data. But if your data is very high-dimensional, the dense approach might not be efficient. What dimension do you work with?
I am currently working with a dataset that stores TF-IDF values only for the terms that occur in each particular document, so every point has a different dimension.
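To use a dense matrix, documents with differing numbers of stored terms all have to be mapped into one shared vocabulary space, with zeros for absent terms. A minimal sketch of that densification step, using hypothetical per-document term-weight maps (not the poster's actual data):

```python
# Hypothetical per-document TF-IDF maps: each document stores weights
# only for the terms it actually contains, so the raw lengths differ.
docs = [
    {"lsh": 0.8, "hash": 0.5},
    {"tfidf": 0.9, "hash": 0.3, "corpus": 0.4},
]

# Build the shared vocabulary: the union of all terms, in a fixed order.
vocab = sorted({term for doc in docs for term in doc})

# Densify: every document becomes a vector of the same dimension,
# len(vocab), with 0.0 for terms that do not occur in it.
dense = [[doc.get(term, 0.0) for term in vocab] for doc in docs]

for row in dense:
    assert len(row) == len(vocab)  # all points now share one dimension
```

After this step every point has the same dimension, which is what a dense LSH index expects.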
In that case, using a sparse representation might be better.
Can you explain the reason behind that? I am still wondering why a sparse representation can perform better than a dense one.
With a dense representation, the code will perform many unnecessary multiplications by zero.
I am currently working on a document similarity project. We are processing text documents to generate TF-IDF vectors for each document in the corpus. In a nutshell, we are working with DENSE DATA: the documents are the data points, and the TF-IDF values of the terms occurring in each document are their features.
We succeeded in implementing LSH with sparse data, but it is not very efficient.
Is it possible to use FALCONN with dense data for an LSH implementation?
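For the dense route, a common preprocessing step before feeding TF-IDF data to a cosine-similarity LSH index is to pack the matrix into a contiguous float32 array and normalize each row to unit length, so cosine similarity reduces to Euclidean distance on the sphere. The sketch below shows only this preprocessing on a hypothetical toy matrix; it is based on FALCONN's published examples, not on its exact API, and does not call the library itself:

```python
import numpy as np

# Hypothetical small TF-IDF matrix: 4 documents x 6 vocabulary terms.
tfidf = np.array([
    [0.0, 0.8, 0.0, 0.5, 0.0, 0.0],
    [0.3, 0.0, 0.9, 0.0, 0.0, 0.4],
    [0.0, 0.0, 0.0, 0.7, 0.7, 0.0],
    [0.2, 0.2, 0.0, 0.0, 0.0, 0.9],
])

# Dense-LSH-style preprocessing (an assumption drawn from FALCONN's
# examples): a contiguous float32 array with unit-norm rows, so that
# cosine similarity becomes Euclidean distance on the unit sphere.
dataset = np.ascontiguousarray(tfidf, dtype=np.float32)
norms = np.linalg.norm(dataset, axis=1, keepdims=True)
dataset /= norms

# Every row now has unit length.
assert np.allclose(np.linalg.norm(dataset, axis=1), 1.0)
```

An array prepared this way could then be handed to the index-construction step of whichever LSH library is used; consult FALCONN's own documentation for the actual table-setup calls.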