Robots.txt for crawlers

It would be nice to have a robots.txt for crawlers / indexers / search engines.

For example, to exclude the "staging" server from being indexed.

So the behaviour of the Nuxt robots-module (?) would differ depending on the deploy environment: https://github.com/nuxt-community/robots-module
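
A minimal sketch of what this could look like in `nuxt.config.js`, assuming the `@nuxtjs/robots` module linked above and a hypothetical `DEPLOY_ENV` environment variable set per deployment (the variable name is an assumption, not part of the module):

```javascript
// nuxt.config.js — sketch only; assumes @nuxtjs/robots is installed
// and that each deployment sets a DEPLOY_ENV variable (hypothetical name).

// Block all crawlers on staging; disallow nothing in production.
const isStaging = process.env.DEPLOY_ENV === 'staging'

export default {
  modules: ['@nuxtjs/robots'],
  robots: {
    UserAgent: '*',
    Disallow: isStaging ? '/' : ''
  }
}
```

With this, the staging deployment would serve a `/robots.txt` containing `Disallow: /`, while production would serve one that allows crawling.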