Describe the bug
Hi, thanks for the template, really great stuff. I've got everything working perfectly; however, the robots.txt doesn't seem to be accessible, either to me or to crawlers (Google etc.).
Error I get locally:
Error I get on live site:
The logs:
```
TypeError: Cannot read properties of undefined (reading 'body')
    at g (/var/task/.next/server/app/[locale]/[authors]/page.js:1:15873)
    at eh (/var/task/node_modules/next/dist/compiled/next-server/app-page.runtime.prod.js:12:134786)
    at e (/var/task/node_modules/next/dist/compiled/next-server/app-page.runtime.prod.js:12:137671)
    at ek (/var/task/node_modules/next/dist/compiled/next-server/app-page.runtime.prod.js:12:138145)
    at Array.toJSON (/var/task/node_modules/next/dist/compiled/next-server/app-page.runtime.prod.js:12:135755)
    at stringify (<anonymous>)
    at eR (/var/task/node_modules/next/dist/compiled/next-server/app-page.runtime.prod.js:12:142219)
    at eE (/var/task/node_modules/next/dist/compiled/next-server/app-page.runtime.prod.js:12:142698)
    at Timeout._onTimeout (/var/task/node_modules/next/dist/compiled/next-server/app-page.runtime.prod.js:12:135475)
    at listOnTimeout (node:internal/timers:581:17) {
  digest: '3283440806'
}
```
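From the trace, it looks like something in the `[authors]` page reads `.body` on a value that came back undefined. For illustration only, a hypothetical page like the sketch below would throw the same error if the data fetch isn't guarded; all names here (`getAuthor`, `Author`, the URL) are placeholders, not the template's actual code:

```tsx
// Hypothetical sketch of the failure mode: reading `.body` from an
// undefined fetch result. Names and URL are assumptions for illustration.
type Author = { body: string };

async function getAuthor(slug: string): Promise<Author | undefined> {
  const res = await fetch(`https://example.com/api/authors/${slug}`);
  if (!res.ok) return undefined;
  return res.json();
}

export default async function AuthorPage({
  params,
}: {
  params: { authors: string };
}) {
  const author = await getAuthor(params.authors);
  // Without this guard, `author.body` below throws
  // "Cannot read properties of undefined (reading 'body')".
  if (!author) {
    return <p>Author not found.</p>;
  }
  return <article>{author.body}</article>;
}
```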
Expected behavior
I expect to be able to see robots.txt and for indexers to be able to crawl it.
Screenshots
If applicable, add screenshots to help explain your problem.
Additional context
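In case it helps narrow things down: my understanding is that in the App Router robots.txt is usually served from a metadata route. Below is a minimal sketch of such a route (assuming that's roughly how this template generates it; the actual file contents and domain will differ):

```ts
// app/robots.ts — minimal sketch of a robots metadata route.
// The sitemap URL is a placeholder, not the real site.
import type { MetadataRoute } from 'next';

export default function robots(): MetadataRoute.Robots {
  return {
    rules: [{ userAgent: '*', allow: '/' }],
    sitemap: 'https://example.com/sitemap.xml',
  };
}
```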
Thanks!