Update Robots.txt in Layout & Design Tool

You can now easily allow or disallow user agents in Global Settings

You can now update your site's robots.txt file directly from the Layout & Design tool, and allow or disallow specific user agents in its Global Settings.

What Is a Robots.txt File?

A robots.txt file tells search engine crawlers which pages or files they can or can't request from your site. It is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google. To keep a page out of Google's search results, use a noindex directive or protect the page with a password.
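For reference, here is a minimal sketch of what a robots.txt file looks like (the paths and user agents below are illustrative examples, not defaults of the tool):

```
# Block all crawlers from a hypothetical admin area
User-agent: *
Disallow: /admin/

# Allow Googlebot to crawl everything
User-agent: Googlebot
Allow: /
```

Each `User-agent` group applies its `Allow`/`Disallow` rules to the named crawler; `*` matches any crawler without a more specific group.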

What Is Robots.txt Used For?

Robots.txt is primarily used to manage crawler traffic to your site. Depending on the file type, it can also keep certain files (such as images or other media) out of Google search results, but for web pages you should rely on noindex rather than robots.txt.
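If your goal is to keep a specific page out of Google's index entirely, the noindex directive mentioned above is the right tool. As a sketch, it can be placed in the page's `<head>`:

```
<meta name="robots" content="noindex">
```

Note that crawlers must be able to fetch the page to see this tag, so a page using noindex should not also be blocked in robots.txt.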

If you have any questions about this feature, email

NEW! Beta Posts Dashboard: Filtering and Search UI Upgrade

New Filters in Posts Dashboard

We've added new filters to the posts dashboard so you can quickly find exactly what you need. You can now filter your results by:

  • Creators You Follow
  • Rating
  • Post Status
  • Post Type
  • Period
  • Sections

Here's what the new filter menu looks like:

Easy-to-Use Search Queries

In the same slick user interface, you can also create search queries and combine them with your filtered results:

  • Post Search
  • Creator Search With Ajax Autocomplete
  • Community Search

This is an exciting step forward in enhancing the editorial experience for our users. It should now be much simpler to narrow down results to exactly what you're looking for.

Here's how the workflow looks in action:

If you have any questions about these dashboard updates, please email
