This repository has been archived by the owner on Apr 19, 2022. It is now read-only.
Hi,
I am using the snakebite client to fetch files from a single-node cluster. Although the snakebite CLI fetches entire files without any problem, I cannot fetch a complete file from Hadoop using the readBlock() method: regardless of which file I request, the transfer stops after exactly 15K bytes.
In my cluster configuration the block size is 128 MB, and all of the files I have placed in the cluster are under 65 MB, so each file occupies a single HDFS block. Is there anything I am missing? Thanks
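A transfer that stops after a fixed, suspiciously round amount like 15K often means a single low-level read call is being treated as the whole block, when it may only return one chunk (e.g. one network packet or buffer's worth). As a generic illustration of the pattern (plain Python with a hypothetical `read_chunk` callable, not the actual snakebite API), the caller has to keep reading until the expected length is consumed:

```python
import io

def read_fully(read_chunk, expected_len):
    """Call read_chunk(n) repeatedly until expected_len bytes are
    collected or the source is exhausted; a single call may return
    only part of the data."""
    buf = bytearray()
    while len(buf) < expected_len:
        chunk = read_chunk(expected_len - len(buf))
        if not chunk:  # EOF before expected length: short read
            break
        buf.extend(chunk)
    return bytes(buf)

# Demo with an in-memory source that returns at most 15 KB per call,
# mimicking a transfer that appears to stop at 15K bytes.
data = b"x" * (64 * 1024)
src = io.BytesIO(data)
result = read_fully(lambda n: src.read(min(n, 15 * 1024)), len(data))
print(len(result))  # 65536
```

If the snakebite call in question behaves this way, comparing the number of bytes returned per call against the block length reported by the namenode would confirm whether the client or the calling code is dropping the remainder.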