
THE LOGICAL BOX
AI news & training for business owners & operators. One email. One clear next step.
THIS WEEK IN AI
This was the week AI stopped pretending to be a chat tool. OpenAI shipped a model built to complete work, not help with it. Google built the chips to run those agents at scale. And Meta announced it is cutting 8,000 jobs while spending $115 billion on AI infrastructure. Three stories, one shift.
In this issue:
OpenAI's GPT-5.5 lands, and the pitch is no longer smarter chatbot
Google's new TPU 8t and 8i chips signal where the money is going
Meta cuts 10% of its workforce while pouring billions into AI
The Deep Cut: What "AI as workforce" actually means for a business with 30 employees
In The Move: One thing you can do this week
THE SIGNAL
What happened in AI this week

Image Source: ChatGPT by Andrew Keener
What happened:
OpenAI released GPT-5.5 on Thursday, just six weeks after GPT-5.4. Greg Brockman called it "a new class of intelligence" and "a big step towards more agentic and intuitive computing." The pitch is different this time. Instead of leading with benchmark scores, OpenAI led with outcomes. The model handles messy, multi-step work, uses your tools, checks its own work, and keeps going without you driving every step. Early enterprise testers reported saving up to 10 hours a week on review and analysis tasks.
Why it matters to your business:
This is no longer a chat product. OpenAI is selling an agent, and the language gives it away. "Hand it a messy task." "It plans the steps." "It works toward a result." If your team is still using AI as a fancy autocomplete, you are using last year's playbook. The companies that will pull ahead this year are the ones that figure out which workflows can be handed off, not just sped up. That requires knowing the workflow first, which is where most owners get stuck.
Source: TechCrunch

Image Source: ChatGPT by Andrew Keener
What happened:
Meta told employees Thursday that it is laying off roughly 10% of its workforce, about 8,000 people, with cuts taking effect May 20. The company also closed 6,000 open roles. The same memo confirmed Meta is spending at least $115 billion on AI infrastructure in 2026. Mark Zuckerberg said earlier this year that "projects that used to require big teams can now be accomplished by a single very talented person." Microsoft announced buyouts the same week. Amazon has cut 30,000 roles since October. Over 92,000 tech workers have been laid off in 2026 so far.
Why it matters to your business:
When Meta and Microsoft cut headcount, it makes the news. When a 50-person company quietly does not backfill a role, it does not. But the math is the same. The question for owners right now is not "will AI replace jobs at my company?" It is "do I know which roles I am about to post that AI could already do half of?" If you do not know the answer, you are about to spend $60K to $90K finding out the slow way.
Source: CNBC

Image Source: ChatGPT by Andrew Keener
What happened:
At Google Cloud Next, Google announced its eighth generation of custom AI chips, splitting them into two products. The TPU 8t is built for training. The TPU 8i is built for inference, which is the work AI does after it has been trained. Google claims up to 3x faster training, 80% better performance per dollar, and the ability to link more than a million chips in a single cluster. Both are positioned squarely at agent workloads, where models reason through problems and run multi-step tasks in continuous loops.
Why it matters to your business:
You will not buy a TPU. But the entire AI infrastructure stack is now being rebuilt around agents that act on their own. That means cheaper, faster AI services running underneath the tools you already use, like ChatGPT, Gemini, Copilot, and your CRM. The cost of running AI is dropping fast. Whatever quote a vendor gives you today for an "AI add-on," expect it to drop or expand in capability within 12 months. Do not sign multi-year deals based on today's pricing.
Source: TechCrunch
THE DEEP CUT
What it actually means for your business
AI is now a hiring decision, not a tool decision
For most of the last two years, the AI conversation in small and mid-sized businesses sounded like a tool conversation. Which platform should we use? ChatGPT or Gemini or Claude? Should we buy Copilot? What about agents? The framing was always: what software do we add.
That framing is over.
This week made it official. OpenAI is selling agents that complete work. Google is building the chips to run them at scale. Meta is showing what happens when a company actually believes the marketing. They stopped buying tools and started restructuring the org chart.
Here is the part most owners have not caught up to yet. Every open role at your company right now is an AI decision, whether you treat it that way or not. When you post a job, you are making a bet that this work needs a person. Sometimes the answer is yes. Sometimes the answer is "maybe half of it." And sometimes the honest answer is "we have not looked at this job in five years, and we are about to spend $60,000 to keep doing it the way we have always done it."
That is not a tool problem. That is a clarity problem.
A while back I worked with a contractor whose estimate process took three hours per bid. They were about to hire an estimator. We pulled apart the workflow, found the parts that were genuinely judgment work and the parts that were spreadsheet shuffling, and put the shuffling on AI. The process went from three hours to twenty minutes. They did not need the hire.
That story is not about AI being magic. It is about somebody actually looking at the work before adding a person to it.
If I sat down with you and watched how work actually gets done at your company, I would bet on three things showing up. There is at least one role where the job has quietly drifted into mostly-coordination, mostly-reporting, mostly-status-updates. There is at least one workflow that nobody has documented because the person doing it just knows. And there is at least one open role you are about to post where you have not asked the question "what part of this actually needs a person."
If any of these sound familiar, that is normal. Most companies are running on workflows that grew up organically and never got a second look.
The shift this year is not whether to use AI. It is whether to keep treating AI as a tool you bolt on, or as a question you ask before every hire. Meta is asking the question loudly. The companies that never ask it at all are the ones that will end up paying for work AI could already do.
Fix the work first. Then decide if you need the hire.
THE MOVE
One thing you can do this week
Pull up the most recent job description you posted, or one you are about to post. Open it next to a blank document and walk through it task by task.
For each responsibility on the list, write one of three things next to it:
Person required. This is judgment, relationship, or accountability work that a human has to own.
AI assists. A person still owns it, but an AI tool can cut the time in half.
AI handles. With the right setup, this could run without a person doing it directly.
Now look at the ratio. If "AI assists" and "AI handles" together make up more than 40% of the job, the role you are posting is not the role you actually need. You either need a smaller role, a different role, or a system instead of a hire.
You do not need to act on it this week. Just see it.
THAT’S A WRAP!
If you have an open role right now, I can tell you which parts of that job actually need a person. It takes 70 minutes and costs $500. Before you spend $60,000 on a hire, spend $500 to find out what you are really hiring for.
Book here: cal.com/aiandrew/beforeyouhire
Thanks for reading,
Andrew Keener
Operations & AI Strategist
Keen Alliance Consulting
Please share The Logical Box link with anyone you think would enjoy it!


