
[GNMT v2/Tensorflow] API Support for Single Queries (similar to google translate api) #621

Open
abbyDC opened this issue Jul 27, 2020 · 2 comments

abbyDC commented Jul 27, 2020

In translate mode, the model seems to be optimized for batch translation, and I can say it is already fast for that use case. The problem is that translating just one sentence takes the same time as processing a whole batch of statements.

Feature wanted:
A mode like the Google Translate API, where I could input a single sentence and get the corresponding translation back almost instantly.

I may be wrong, so if this feature is already present, I'd appreciate it if you'd comment here so I can close the issue.

abbyDC added the enhancement (New feature or request) label on Jul 27, 2020
@mwawrzos (Contributor) commented:
Can you give an example of how the Google Translate API can be used in the mode you mean?

It would be great if you could share a bash/Python script that shows, step by step, how it works.

abbyDC commented Jul 31, 2020

@mwawrzos Something similar to this example from https://pythonprogramming.net/translation-api-google-cloud-tutorial/: a function that takes a text and returns its translation. This is a quick way to get a translation for a single string.

from google.cloud import translate

def translate_text(text, target='en'):
    """
    Translate `text` into the target language.
    Target must be an ISO 639-1 language code.
    https://cloud.google.com/translate/docs/languages
    """
    translate_client = translate.Client()
    result = translate_client.translate(
        text,
        target_language=target)

    print(u'Text: {}'.format(result['input']))
    print(u'Translation: {}'.format(result['translatedText']))
    print(u'Detected source language: {}'.format(
        result['detectedSourceLanguage']))

example_text = "Hola saludos"
# Note: in Python 3, str is already Unicode, so no .decode('utf-8') is needed.
translate_text(example_text, target='en')
### OUTPUT ###
"""
Text: Hola saludos
Translation: Hello greetings
Detected source language: es
"""

Issue with the current translation mode: looking at the code, Estimator.predict() slows down single-query inference because it rebuilds the computation graph on each call to predict().
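A known workaround for this kind of per-call graph rebuild is to keep a single predict() generator alive and feed new queries to it through a queue, so the expensive setup happens only once. The sketch below illustrates the pattern in pure Python with a mock estimator; MockEstimator and PersistentPredictor are hypothetical names, not part of TensorFlow or this repository, and the real fix would wire the same idea into the TF Estimator's input_fn.

```python
import queue


class MockEstimator:
    """Stand-in for an Estimator-like object (hypothetical): setting up
    the "graph" is expensive, per-item inference is cheap."""

    def __init__(self):
        self.graph_builds = 0

    def predict(self, input_fn):
        # Expensive one-time setup happens once per predict() generator,
        # mimicking graph construction in Estimator.predict().
        self.graph_builds += 1
        for item in input_fn():
            yield item.upper()  # pretend "translation"


class PersistentPredictor:
    """Keep one predict() generator alive; feed single queries via a queue."""

    def __init__(self, estimator):
        self._inputs = queue.Queue()

        def input_fn():
            # Yield queries as they arrive; None is the shutdown sentinel.
            while True:
                item = self._inputs.get()
                if item is None:
                    return
                yield item

        self._outputs = estimator.predict(input_fn)

    def translate(self, text):
        # Push one query in, pull one result out; no graph rebuild.
        self._inputs.put(text)
        return next(self._outputs)


est = MockEstimator()
pred = PersistentPredictor(est)
print(pred.translate("hola"))   # single query answered immediately
print(pred.translate("adios"))  # second query reuses the same generator
print(est.graph_builds)         # setup ran only once for both queries
```

With a real tf.estimator.Estimator, the same shape works because predict() accepts a generator-backed input_fn: as long as the returned iterator is kept alive, the graph and checkpoint load are paid only on the first query.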
