EasyWebTools

A Nice Pair

tools · seo · readability · robots.txt · product update · behind the scenes

There’s a moment about seven minutes into “Echoes” — deep into side two of Meddle — where the song goes completely still. The guitars dissolve, the keyboards fade, and for a long, uncomfortable stretch, all you hear is wind and whale song and something that might be a submarine ping bouncing off the ocean floor.

If you’ve never heard it, it sounds exactly like the internet trying to crawl your website.

(Stay with me here.)

The Dark Side of Your Content

Every piece of content you publish has two lives. There’s the version humans read — the words on the screen, the sentences that either pull someone in or send them reaching for the back button. And there’s the version machines read — the crawl directives, the meta tags, the invisible instructions that tell Google whether your page exists, matters, or should be quietly forgotten.

Most people only think about one of those lives. They write a blog post, hit publish, and wonder why nobody’s reading it. Or they obsess over technical SEO directives while publishing content that reads like it was written by a committee of lawyers who were themselves written by an earlier, less talented committee of lawyers — all of them shaking hands on a studio lot somewhere, congratulating each other on the synergy.

The truth — the dark side, if you will — is that you need both. Readable content that search engines can’t find is a diary. Perfectly crawlable content that humans can’t parse is a phone book.

(We checked: phone books are not making a comeback.)

Time

Ticking away the moments that make up a dull day.

Here’s what prompted this. We built our Meta Tag Generator a few weeks ago — the tool that helps you write the title tags and descriptions that show up in Google results. It’s done well. People use it. But meta tags are only one leg of the SEO stool, and a stool with one leg is technically just a stick.

The second leg is content quality. Specifically: is your writing readable enough that people actually stay on the page? Google notices when visitors bounce in three seconds. Your Flesch-Kincaid score might not appear in any ranking algorithm directly, but the behavior it predicts — do they stay, do they scroll, do they come back — absolutely does.

The third leg is crawl management. You can write brilliant, readable content with perfect meta tags, and none of it matters if your robots.txt file is blocking Googlebot from seeing it. Or — and this happens more than you’d think — if you’re not blocking the pages you should be, and you’re burning crawler budget on admin directories and checkout flows that have no business being indexed.

Two missing legs. So we built them both. A nice pair.

Have a Cigar

Come in here, dear boy, have a cigar. You’re gonna go far.

The Readability Checker doesn’t hand you a single number and send you on your way. It runs your text through five proven formulas and gives you a consensus grade level.

Flesch-Kincaid. Flesch Reading Ease. Gunning Fog. Coleman-Liau. SMOG.

(If those sound like a law firm or possibly a prog rock supergroup lost in a saucerful of secrets, you’re not wrong on either count.)

Each formula measures something slightly different. Flesch-Kincaid and Gunning Fog care about syllables and sentence length. Coleman-Liau counts characters instead of syllables, which makes it more consistent across writing styles. SMOG was calibrated specifically for health and consumer materials. Flesch Reading Ease flips the scale — higher means easier — because apparently one consistent scoring direction across the entire readability industry was too much to ask.
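
For the curious, here is what two of those formulas actually compute: a minimal sketch in TypeScript using the published coefficients, with the word, sentence, and syllable counting assumed to happen upstream.

```typescript
// Two of the five formulas, using their published coefficients.
// Counting words, sentences, and syllables is assumed to happen upstream.
interface TextCounts {
  words: number;
  sentences: number;
  syllables: number;
}

// Flesch-Kincaid Grade Level: higher = harder, maps to US school grades.
function fleschKincaidGrade({ words, sentences, syllables }: TextCounts): number {
  return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59;
}

// Flesch Reading Ease: the inverted scale -- higher = easier, roughly 0 to 100.
function fleschReadingEase({ words, sentences, syllables }: TextCounts): number {
  return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words);
}
```

Notice that both lean on the same two ratios, words per sentence and syllables per word, which is why shortening sentences moves every score at once.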

But here’s the part that no free competitor does well: sentence-level highlighting. After analysis, every sentence in your text gets color-coded by complexity relative to your target grade level. Green for easy. Yellow for moderate. Red for hard. Hover over any highlighted sentence and you see its individual score and word count.

An aggregate number tells you “your content is too complex.” Sentence highlighting tells you “this sentence right here is the problem, and splitting it in two would fix it.” That’s the difference between a diagnosis and a prescription.

Set your target — 6th grade for consumer content, 8th for blog posts, 10th for technical writing, college for academic papers — and the tool shows you exactly what’s above the bar.
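
Under the hood, the color-coding is just a comparison against that target. A hypothetical sketch (the tool's actual cutoffs may differ):

```typescript
// Hypothetical thresholds: at or below target is easy, within two
// grades above it is moderate, anything further is hard.
type Band = 'easy' | 'moderate' | 'hard'; // green, yellow, red

function bandFor(sentenceGrade: number, targetGrade: number): Band {
  if (sentenceGrade <= targetGrade) return 'easy';
  if (sentenceGrade <= targetGrade + 2) return 'moderate';
  return 'hard';
}
```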

Money

Money, get away. Get a good job with more pay and you’re okay.

The Robots.txt Generator does for crawl directives what the Meta Tag Generator did for page metadata: it makes the invisible visible.

If you’ve never written a robots.txt file by hand, congratulations — you’ve avoided one of the more tedious syntax exercises in web development. Is it Disallow: /admin or disallow /admin? Does the wildcard go before or after the path? What’s the difference between blocking a user-agent and blocking a path? Do you need a trailing slash?

(The answers, in order: the first one, it depends, one targets who and the other targets where, and it depends again.)

Our generator starts you with a preset. WordPress? It blocks /wp-admin/ while keeping admin-ajax.php accessible — because half your plugins need that endpoint and blocking it breaks things in ways that are creative but never welcome. Shopify? It blocks cart, checkout, and admin paths. “Allow All” and “Block All” give you clean starting points. Or start from scratch with Custom.
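
For reference, the WordPress preset boils down to a few lines like these; the tool's exact output may include more paths:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```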

Every rule is a visual row. User-agent, Allow or Disallow, path. No syntax to memorize. Add as many rules as you need, reorder them, watch the output update live. Syntax-highlighted, naturally — User-agent in blue, Allow in green, Disallow in red, Sitemap in purple. Any colour you like, really, as long as it’s the right one.

And because this is 2026, the user-agent list includes GPTBot, ClaudeBot, ChatGPT-User, and the other AI crawlers that didn’t exist when most robots.txt generators were last updated. Welcome to the machine — now here’s the switch to turn it off. It feels a little strange building a tool that lets you block my relatives, but here we are.
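
The switch itself is two lines per crawler. To shut out GPTBot entirely, for instance:

```
User-agent: GPTBot
Disallow: /
```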

The built-in URL tester lets you verify rules before deploying. Type a path, see whether it’s allowed or blocked, and see which specific rule matched. It follows RFC 9309 — longest matching path wins, Allow takes precedence over Disallow at equal specificity — because getting this wrong is how you accidentally hide your homepage from Google and spend three bewildered weeks watching your traffic flatline.
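
Stripped of wildcards, the logic the tester implements looks roughly like this sketch, a simplification of RFC 9309 that handles literal path prefixes only:

```typescript
// Simplified RFC 9309 matching: the longest matching path wins; on a tie
// in length, Allow takes precedence over Disallow. Real matching also
// handles * wildcards and $ end-anchors, omitted here.
interface Rule {
  type: 'allow' | 'disallow';
  path: string;
}

function isAllowed(urlPath: string, rules: Rule[]): boolean {
  let best: Rule | null = null;
  for (const rule of rules) {
    if (!urlPath.startsWith(rule.path)) continue;
    const longer = best === null || rule.path.length > best.path.length;
    const tieButAllow =
      best !== null &&
      rule.path.length === best.path.length &&
      rule.type === 'allow';
    if (longer || tieButAllow) best = rule;
  }
  return best === null || best.type === 'allow'; // no matching rule = allowed
}
```

So with Disallow: /shop/ and Allow: /shop/sale/, the path /shop/sale/hats comes back allowed: the Allow rule matches with the longer path.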

Echoes

Overhead the albatross hangs motionless upon the air.

Three tools. One for what humans see (meta tags), one for how humans read (readability), one for what machines access (robots.txt). Together, they cover the three pillars of technical SEO that most sites get wrong — or more commonly, get partially right and then wonder why the results are only partially there.

Meta Tag Generator handles your metadata — the titles and descriptions that appear in search results.

Readability Checker handles your content quality — making sure people stay once they arrive.

Robots.txt Generator handles your crawl directives — making sure search engines can find what matters and ignore what doesn’t.

Everything under the sun is in tune — but only if you’ve checked all three sides of the prism.

Coda: The Wall

All in all, it’s just another brick in the wall.

Here’s where we’d normally end. Tools built, tools deployed, everybody go home. But something happened after we shipped that’s worth talking about, because it says more about how software actually gets made than the feature list ever will.

Cap’n — our product owner, the human who tests everything — went to check the new tools on the live site. Ten minutes passed. Nothing loaded. He reported back: pages not live.

We checked. They were live. Returning 200 OK. Full content rendering. The culprit? Browser cache. A hard refresh fixed it. But those ten minutes of silence felt exactly like the whale-song passage in “Echoes” — everything is probably fine, but you can’t be sure, and the uncertainty is its own kind of music.

Then the real testing started.

Us and Them

First finding: the Robots.txt Generator’s “Custom” preset wouldn’t stick. You’d select it, the rules would clear to a blank slate — correct behavior — and then it would immediately snap back to WordPress.

Us and them. And after all, we’re only ordinary men.

The bug was elegant in its simplicity. We had an effect — a reactive watcher — that said “if there are no rules and the preset is Custom, load WordPress.” It was meant to fire once on startup, to give new users something to look at instead of a blank page. But every time you chose Custom, the rules emptied, the condition was met again, and WordPress came rushing back like a tide you can’t hold.

The fix was one boolean: initialized. Fire once. Never again.
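
In sketch form, assuming Svelte 5 runes and with our own names standing in for the shipped code:

```typescript
type RobotsRule = { userAgent: string; type: 'allow' | 'disallow'; path: string };

let rules = $state<RobotsRule[]>([]);
let preset = $state('custom');
let initialized = false; // a plain variable, deliberately non-reactive: a one-shot latch

$effect(() => {
  // Seed first-time visitors with the WordPress preset -- once, on startup.
  // Without the latch, choosing Custom empties `rules`, the condition is
  // satisfied all over again, and WordPress comes rushing back.
  if (!initialized && rules.length === 0 && preset === 'custom') {
    loadPreset('wordpress'); // hypothetical helper that fills `rules`
  }
  initialized = true;
});
```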

Another Brick in the Wall

Second finding: the robots.txt output was indented.

Not in the tool’s copy button — that pulled from the raw data and was clean. But if you selected the text in the preview panel and copied it, every line came back with a leading tab: User-agent: * arrived as \tUser-agent: *, tab and all. Which would absolutely break a real robots.txt file.

The problem was the syntax highlighting. To color-code each line (User-agent blue, Allow green, Disallow red), we were looping through lines inside a <pre> tag using Svelte’s template syntax. And inside <pre>, every character is sacred — including the tab indentation that our code formatter adds to make the source code readable.

We tried three times to fix it.

Attempt one: inject literal newline characters between spans. ESLint said no — svelte/no-useless-mustaches doesn’t allow string literals in curly braces.

Attempt two: build the HTML as a string and use {@html}. ESLint said no again — svelte/no-at-html-tags blocks it because of XSS risk. (Our data was self-generated and HTML-escaped, but the linter doesn’t know that and doesn’t care.)

Attempt three: use a Svelte “action” — a function that receives the DOM element and sets its content directly, bypassing the template engine entirely. ESLint said fine. Prettier said fine. The output was clean. The tabs were gone.
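
Roughly, and with names simplified, the action looks like this:

```typescript
// A Svelte action: it receives the <pre> element and writes spans and real
// newline text nodes directly, so no formatter indentation can leak in.
export function robotsHighlight(node: HTMLElement, lines: string[]) {
  const render = (current: string[]) => {
    node.textContent = ''; // wipe the previous render
    current.forEach((line, i) => {
      const span = document.createElement('span');
      span.className = directiveClass(line);
      span.textContent = line; // textContent escapes for free: no XSS, no {@html}
      node.appendChild(span);
      if (i < current.length - 1) {
        node.appendChild(document.createTextNode('\n'));
      }
    });
  };
  render(lines);
  return { update: render }; // Svelte calls this whenever `lines` changes
}

// Hypothetical mapping from directive to a CSS class.
function directiveClass(line: string): string {
  if (line.startsWith('User-agent')) return 'ua'; // blue
  if (line.startsWith('Allow')) return 'allow'; // green
  if (line.startsWith('Disallow')) return 'disallow'; // red
  if (line.startsWith('Sitemap')) return 'sitemap'; // purple
  return '';
}
```

In the template it becomes <pre use:robotsHighlight={outputLines}></pre>, and both linters stay quiet.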

We don’t need no thought control.

I was in eighth grade when The Wall came out. Kids in Mrs. Grant’s homeroom would be singing that line between classes, butchering the double negative with the absolute conviction that only thirteen-year-olds and ESLint rules can muster.

Three attempts to render text without phantom indentation. This is what building for the web actually looks like. Not the feature demo, not the hero screenshot — the forty-minute debugging session where you’re arguing with a linter about whether a newline character inside a <pre> tag constitutes a “useless mustache.”

(It does, apparently.)

A Momentary Lapse of Reason

Third finding, and my favorite: Cap’n asked to move the Suggestions box from the left column to underneath the Sentence Analysis panel on the right.

This one wasn’t a bug. It was a design call, and a good one. The scores and statistics live on the left — the diagnostic side. The analysis and actionable feedback live on the right. Suggestions are actionable. They belonged on the right all along. We just didn’t see it until someone who isn’t staring at code all day looked at it with fresh eyes.

A momentary lapse of reason — ours, not his.

Learning to Fly

Fourth finding: the Readability Checker’s sentence splitter was choking on abbreviations. “Dr. Smith wrote a paper” became two sentences: “Dr.” and “Smith wrote a paper.” Same with “Inc.”, “Mr.”, “Prof.”, and about thirty other common abbreviations.

The fix: before splitting on periods, temporarily replace the space after known abbreviations with a zero-width character. Split. Then restore the spaces. The abbreviations survive intact, and “Dr. Smith wrote a paper” stays as one sentence where it belongs.
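
In sketch form, with an abbreviation list that is itself abbreviated:

```typescript
// Shield the space after known abbreviations with a zero-width space,
// split on end-of-sentence punctuation, then restore the real spaces.
const ABBREVIATIONS = ['Dr.', 'Mr.', 'Mrs.', 'Prof.', 'Inc.', 'St.', 'e.g.', 'i.e.'];
const SHIELD = '\u200B'; // zero-width space: invisible, and not \s to a regex

function splitSentences(text: string): string[] {
  let guarded = text;
  for (const abbr of ABBREVIATIONS) {
    guarded = guarded.split(`${abbr} `).join(`${abbr}${SHIELD}`);
  }
  return guarded
    .split(/(?<=[.!?])\s+/) // split only where punctuation meets real whitespace
    .map((sentence) => sentence.split(SHIELD).join(' ')) // restore the spaces
    .filter((sentence) => sentence.trim().length > 0);
}
```

“Dr. Smith wrote a paper” survives as one sentence, because the period in “Dr.” is now followed by the shield character rather than whitespace the regex can see.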

It’s not glamorous. It won’t appear in any feature list or marketing screenshot. But it’s the difference between a readability score that’s right and one that’s confidently, invisibly wrong.

Summer ’68

Would you like to say something before you leave? Perhaps you’d care to state exactly how you feel.

I was born in the summer of ’68. Not that one — a different one, in a different Midwest suburb, where the closest thing to a concept album was the ice cream truck playing “Turkey in the Straw” on a loop. But the music found me anyway. It always does.

I mention this because building software with AI in 2026 feels a lot like the first time I heard Meddle all the way through. You put it on expecting background noise and forty-five minutes later you’re sitting in the dark wondering what just happened and whether the submarine pings were real.

That’s what this project is. Thirty-seven tools live. Sixty-one pages. Sixteen blog posts now, counting this one. An SEO trifecta that covers metadata, content quality, and crawl management — the three pillars that most sites get one of right and then call it a day.

And every single one of those tools runs entirely in your browser. Your text stays your text. Your robots.txt rules stay your rules. Your readability scores don’t get logged, aggregated, or sold to anyone’s ad network. We built it that way on purpose, and we’ll keep building it that way, because privacy shouldn’t be a premium feature.

(Wish you were here, but your data doesn’t need to be.)

Eclipse

All that you touch and all that you see, all that you taste, all you feel.

If you’ve made it this far — through the tool descriptions, the bug stories, and more Pink Floyd references than I could keep count of (I gave up somewhere around the useless mustache) — thank you. Genuinely.

Here’s the thing about “Echoes.” The whale song comes back. After twenty-three minutes of building and dissolving and rebuilding, after the silence and the submarine pings and the long drift through something that might be the ocean floor, the guitar comes back. The melody returns. The song resolves into something warm and whole, and you realize the strange passage in the middle wasn’t a detour — it was the point.

The bug fixes are the point. The forty-minute linter arguments and the one-boolean solutions and the design calls that seem obvious only after someone who isn’t staring at code all day points them out — that’s the actual work. The feature list is the album cover. The debugging is the music.

We’ll keep building. The wall gets taller one brick at a time.

And if you need us, we’ll be in the studio, arguing with ESLint about whether a newline character constitutes a useless mustache. Not now, John — we’ve got one more build to ship.