As far as I can tell, we're currently using robots.txt to block untranslated docs pages from being indexed by Google. Looking through Search Console, it appears this is not the correct approach: the pages are still being indexed, and they show up under the "Indexed, though blocked by robots.txt" warning.

That warning links to: https://support.google.com/webmasters/answer/7440203#indexed_though_blocked_by_robots_txt
That page suggests using a noindex meta tag or X-Robots-Tag response header instead: https://support.google.com/webmasters/answer/93710
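
In practice that means either adding `<meta name="robots" content="noindex">` to the untranslated pages' HTML, or sending an `X-Robots-Tag: noindex` response header. Here's a minimal sketch of the header approach, assuming an Express-style server and a hypothetical list of untranslated locale prefixes (our actual routing setup will differ):

```ts
// Sketch only (not our actual setup): serve untranslated docs pages with a
// noindex directive instead of blocking them in robots.txt.
import express from "express";

const app = express();

// Hypothetical: locale prefixes whose docs are not yet translated.
const UNTRANSLATED_LOCALES = ["/fr/", "/ja/"];

app.use((req, res, next) => {
  // Tell crawlers not to index these pages, while still allowing them to be
  // crawled (Google only sees this header if robots.txt does NOT block the page).
  if (UNTRANSLATED_LOCALES.some((prefix) => req.path.startsWith(prefix))) {
    res.set("X-Robots-Tag", "noindex");
  }
  next();
});

app.use(express.static("build")); // serve the generated docs
app.listen(3000);
```

Whichever variant we pick, the robots.txt disallow rules for those pages would need to be removed, since Google can only honor a noindex directive on pages it's allowed to crawl.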