autossh + SSH Keepalive: The -M 0 Trick

autossh has its own connection monitoring that opens extra ports for heartbeats (the -M port, plus the next one up for the echo reply). But this is redundant — SSH already has native keepalive via ServerAliveInterval. The extra ports can even cause problems if they're not available on the remote side.

The modern approach: disable autossh's built-in monitoring with -M 0 and let SSH's native heartbeats handle it. When SSH detects a dead connection (after ServerAliveInterval * ServerAliveCountMax seconds of no response), autossh automatically reconnects.

Put everything in ~/.ssh/config:

Host gateway
    HostName your-gateway-ip
    User ec2-user
    ServerAliveInterval 30
    ServerAliveCountMax 3
    LocalForward 8080 internal-soap-ec2:8080

Then the command reduces to:

autossh -M 0 -Nf gateway

-M 0 is the only autossh-specific flag you need. Everything else — host, user, keepalive, even the tunnel — lives in SSH config where it belongs. ServerAliveInterval 30 sends a heartbeat every 30 seconds through the SSH connection itself (no extra ports needed), and ServerAliveCountMax 3 means 90 seconds of silence is treated as a dead connection, at which point autossh restarts the tunnel.

Newline in OpenAI Codex CLI: It's Ctrl-J, Not Shift+Enter

Shift+Enter doesn't insert a newline in the OpenAI Codex CLI. This isn't a terminal or environment issue — you'll see the same behavior whether you're using iTerm2, the macOS Terminal app, or anything else. It's built into Codex itself.

The shortcut for a new line is Ctrl-J.

This catches people off guard because Shift+Enter is the standard multiline shortcut in virtually every chat and code editor (Claude Code, ChatGPT web, VS Code, etc.). The muscle memory is strong and there's currently no way to remap it — Codex doesn't expose a keybinding config.

AWS SSO login on a headless server? Use `--use-device-code`

AWS CLI v2.22.0+ switched the default SSO login flow to PKCE, which requires a browser on the same machine. If you're SSH'd into a server with no GUI, aws sso login just hangs waiting for a browser that doesn't exist.

The fix is two flags:

aws sso login --profile prod --no-browser --use-device-code

It prints a URL and a one-time code. Open the URL on any other device (phone, laptop, tablet), enter the code, authenticate, and you're done. The CLI polls in the background and picks up the token automatically.
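For reference, a device-code-friendly profile in ~/.aws/config might look like the sketch below. Every name and ID here is a placeholder (the sso-session layout is AWS CLI v2's standard format; substitute your own start URL, account, and role):

```ini
[profile prod]
sso_session = my-sso
sso_account_id = 123456789012
sso_role_name = AdministratorAccess
region = ap-northeast-1

[sso-session my-sso]
sso_start_url = https://my-org.awsapps.com/start
sso_region = us-east-1
sso_registration_scopes = sso:account:access
```

With that in place, `aws sso login --profile prod --no-browser --use-device-code` caches the token, and every subsequent command with `--profile prod` picks it up.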

I Automated My Blog Publishing Pipeline with a Claude Skill and Three Shell Scripts

I write a lot of short tech notes — things I learn during daily work that are worth remembering. For years the workflow was: learn something → forget to write it down → never find it again.

Then I built a pipeline: a Claude Code skill writes the note, a script publishes it to my Chyrp Lite blog, and a cron job keeps the session alive. Now I type /record and it's done — note saved, blog posted, zero friction.

The Pipeline

Three pieces, each doing one thing:

  1. /record skill — Claude writes a tech note from the current conversation
  2. pub_tech_note — publishes the markdown file to my blog via curl
  3. refresh_chyrp_token — re-logs into the blog monthly to keep the session cookie fresh
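The cron piece is a one-liner. The schedule and path below are examples (the script just needs to run more often than the session cookie expires):

```cron
# crontab entry: re-login at 03:00 on the 1st of every month
0 3 1 * * ~/bin/refresh_chyrp_token
```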

The Skill

The skill is a SKILL.md file that tells Claude how to distill a conversation into a short article. Key rules: find the one non-obvious insight, pick a title that makes someone click, and write like a colleague's quick tip, not like documentation.

After writing the file, the skill runs ~/bin/pub_tech_note <filepath> to publish automatically.
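For context, here's a minimal sketch of what such a skill file could look like. The frontmatter keys (name, description) follow Claude Code's skill format; the instruction text is a paraphrase of the rules above, not the real file:

```markdown
---
name: record
description: Distill the current conversation into a short tech note and publish it
---

Find the single non-obvious insight in this conversation and write it up
as a short article with a title that makes someone click. Tone: a
colleague's quick tip, not documentation.

When the markdown file is written, run: ~/bin/pub_tech_note <filepath>
```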

The Publishing Script

Chyrp Lite has no API. The admin panel is just HTML forms. But that means you can automate it with curl — fetch the CSRF hash, build a multipart form POST, and send it.

The tricky part was getting the multipart body right:

…more

Automating Chyrp Lite blog posts with curl

Chyrp Lite has no API — the admin panel is the only way to create posts. But since it's just standard HTML forms, you can automate it entirely with curl. Here's the full pipeline: auto-login, extract CSRF token, create posts.

Step 1: Auto-login to get a session cookie

The login form at /login/ uses standard URL-encoded POST with a CSRF hash. You need to fetch the login page first to extract the hash from the hidden field, then POST credentials:

# Fetch login page, extract CSRF hash (grep -oP needs GNU grep; -P is PCRE)
LOGIN_HTML=$(curl -s -c /tmp/chyrp_cookies.txt https://blog.example.com/login/)
HASH=$(echo "$LOGIN_HTML" | grep -oP 'name="hash"\s+value="\K[^"]+')

# POST login (URL-encode USER/PASS first if they contain &, =, or %)
curl -s -D /tmp/chyrp_headers.txt -b /tmp/chyrp_cookies.txt -c /tmp/chyrp_cookies.txt \
  -X POST https://blog.example.com/login/ \
  -H 'content-type: application/x-www-form-urlencoded' \
  --data-raw "login=${USER}&password=${PASS}&hash=${HASH}&submit="

# Extract session token
TOKEN=$(grep -oP 'ChyrpSession=\K[^;]+' /tmp/chyrp_headers.txt /tmp/chyrp_cookies.txt | head -1)

Step 2: Create a post via multipart form POST

The add_post endpoint expects multipart/form-data with CRLF line endings, and the same CSRF hash from step 1 is used here too. Watch out for shell-special characters in the title and body — never splice user content into a double-quoted string (backticks and $() get expanded there); pass it as an argument to printf %s instead:
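Here's a sketch of that body-building step. The field names (title, body, hash) and the admin add_post URL are assumptions — inspect your Chyrp Lite write form for the real input names:

```shell
# Example values — a title with backticks and $() that must survive verbatim
TITLE='Tips on `curl` and $(date)'
POST_BODY='Hello from the pipeline.'
HASH='example-csrf-hash'          # the hash extracted in step 1

BOUNDARY="----chyrp-$$"
BODY_FILE=/tmp/chyrp_post_body

# printf %s emits each value untouched (no backtick or $() expansion),
# and \r\n in the format string gives the CRLF endings multipart requires
{
  printf -- '--%s\r\nContent-Disposition: form-data; name="title"\r\n\r\n' "$BOUNDARY"
  printf '%s\r\n' "$TITLE"
  printf -- '--%s\r\nContent-Disposition: form-data; name="body"\r\n\r\n' "$BOUNDARY"
  printf '%s\r\n' "$POST_BODY"
  printf -- '--%s\r\nContent-Disposition: form-data; name="hash"\r\n\r\n' "$BOUNDARY"
  printf '%s\r\n' "$HASH"
  printf -- '--%s--\r\n' "$BOUNDARY"
} > "$BODY_FILE"

# Then send it with the session cookie from step 1 (URL is a guess —
# check your install):
# curl -s -b /tmp/chyrp_cookies.txt \
#   -H "content-type: multipart/form-data; boundary=${BOUNDARY}" \
#   --data-binary @"$BODY_FILE" \
#   "https://blog.example.com/admin/?action=add_post"
```

An alternative is curl's -F/--form-string options, which assemble the multipart body for you; building it by hand just makes the CRLF requirement explicit.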

…more