I just discovered that the popular The SEO Framework plugin overwrites the `robots_txt` filter output for security reasons: https://github.com/sybrew/the-seo-framework/issues/339

The fix is to register our filter with a higher priority (11), so it runs after the plugin's own filter:

```php
add_filter( 'robots_txt', array( __CLASS__, 'robots_txt' ), 11 );
```

Thank you.
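For context, here is a minimal sketch of how that registration might sit inside a plugin class. The class name, method body, and the example `Disallow` rule are hypothetical; the point is that WordPress runs `robots_txt` callbacks in ascending priority order, so priority 11 fires after The SEO Framework's callback at the default priority 10:

```php
<?php
// Hypothetical plugin class illustrating the priority-11 fix.
class My_Robots_Filter {

	public static function init() {
		// Priority 11 runs after The SEO Framework's own
		// robots_txt filter, which hooks in at the default 10,
		// so our additions are no longer overwritten.
		add_filter( 'robots_txt', array( __CLASS__, 'robots_txt' ), 11 );
	}

	// Append custom rules to the generated robots.txt output.
	public static function robots_txt( $output ) {
		$output .= "Disallow: /example-private-path/\n"; // example rule
		return $output;
	}
}

My_Robots_Filter::init();
```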