Want to learn how to USE AI technology to make money and/or your life easier? Join our FREE AI community here: https://www.skool.com/ai-with-apex/about

AI’s Two Hidden Bottlenecks: Bad Labor Data and Inefficient Infrastructure

AI’s next phase is not just about better models. It is also about whether we can measure its effect on work clearly and run the hardware behind it more efficiently.

TL;DR

  • The AI jobs debate is moving faster than the data needed to measure what is actually changing in the labor market.
  • Recent reporting suggests AI’s effect on aggregate unemployment is visible but still modest so far.
  • Task-level changes may matter more than job-title changes, which makes traditional labor data less useful for tracking AI disruption in real time.
  • MIT researchers say their Sandook system improves shared SSD performance in data centers without requiring specialized hardware.
  • Together, the two stories point to the same issue: AI progress now depends as much on measurement and operational efficiency as on model capability.

The AI jobs debate still lacks the data to settle it

What happened

A new MIT Technology Review story argues that the AI jobs debate is missing a crucial category of labor-market data, making it harder to separate real disruption from speculation. In the publicly accessible excerpt, one economist calls the gap serious enough to warrant a "Manhattan Project" focused on the data itself.

Why it matters

Much of the public argument about AI and jobs still swings between two extremes: mass replacement or mostly harmless augmentation. But labor-market data usually arrives slowly and is often organized by occupation, while AI tends to change work first at the task level, inside jobs rather than across entire job categories.

Key details

  • The accessible excerpt of the MIT Technology Review piece says the debate is outrunning the evidence and points to a key missing labor-market dataset as the core problem.
  • Axios reported on April 7 that Goldman Sachs and Morgan Stanley estimated AI’s overall effect on unemployment so far at about 0.1 percentage point, suggesting disruption is measurable but far from an economy-wide collapse.
  • A January 2026 Axios summary of Anthropic’s analysis said current AI use looks more like augmentation than pure replacement, with workers using AI for parts of jobs rather than seeing whole roles disappear at scale.
  • The Bureau of Labor Statistics said in its 2024–34 projections overview that AI is likely to constrain growth in some occupations while supporting growth or productivity in others, reinforcing that the effects are uneven rather than uniform.
  • Pew’s methodology work on O*NET-based AI exposure highlights why task-level analysis matters: occupations are bundles of tasks, and AI can reshape those bundles before headline employment counts move much.
  • Research from April 2026 is also moving toward more granular frameworks, including agentic task exposure and worker-based evaluations, which suggest broad task change may be more common than sudden mass replacement in current data.
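The task-bundle framing above can be made concrete with a toy calculation: treat an occupation as a weighted set of tasks and measure what share of the total task weight is AI-exposed. Everything in this sketch is invented for illustration; the task names, weights, and exposure flags are not real O*NET data.

```python
# Toy illustration of task-level AI exposure for one occupation.
# Task names, weights, and exposure flags are invented, NOT O*NET data.

def exposure_share(tasks):
    """Fraction of an occupation's task weight flagged as AI-exposed."""
    total = sum(weight for _, weight, _ in tasks)
    exposed = sum(weight for _, weight, is_exposed in tasks if is_exposed)
    return exposed / total

# (task, share of work time, AI-exposed?)
paralegal = [
    ("draft routine documents", 0.40, True),
    ("summarize case files",    0.25, True),
    ("client meetings",         0.20, False),
    ("court filings in person", 0.15, False),
]

print(f"{exposure_share(paralegal):.0%}")  # 65% of task weight exposed
```

The point the sketch illustrates: AI can reshape 65% of this hypothetical job's workload while the headline employment count for the occupation stays flat, which is exactly why occupation-level data lags task-level change.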

Source links
https://nesnanett.com/index.cfm?action=main.article&id=4733
https://www.axios.com/2026/04/07/ai-jobs-goldman-sach-morgan-stanley?utm_source=openai
https://www.bls.gov/opub/mlr/2026/article/industry-and-occupational-employment-projections-overview.htm?utm_source=openai

MIT’s Sandook could help data centers get more from existing SSDs

What happened

MIT researchers introduced a storage-balancing system called Sandook that is designed to improve performance in data centers with pooled SSD storage. According to MIT News, the system uses a two-tier architecture with a central controller and local controllers to adapt to shifting workloads in real time, without specialized hardware.

Why it matters

As AI workloads drive up demand for compute, storage throughput, and power, operators are under pressure to get more useful work from the hardware they already own. A system that reduces bottlenecks in shared SSD pools can improve application performance, raise utilization, and potentially delay expensive hardware upgrades.

Key details

  • MIT says Sandook targets three major sources of SSD performance variability at once: hardware heterogeneity, read/write interference, and garbage collection slowdowns.
  • The design uses a global scheduler for broader balancing decisions and local schedulers on individual machines to respond quickly when a device becomes congested or slows down.
  • MIT News reports that Sandook improved application throughput by 12% to 94%, improved SSD capacity utilization by 23%, and reached up to 95% of theoretical maximum performance in testing.
  • The underlying paper reports 30% to 82% raw I/O throughput improvement over systems that address only one source of variability, along with 71% to 88% latency improvement and 23% GPU utilization improvement in evaluated workloads.
  • MIT says the evaluated workloads included applications such as AI model training and image compression, which helps explain why storage scheduling now matters beyond traditional infrastructure niches.
  • The research is scheduled for presentation at the USENIX Symposium on Networked Systems Design and Implementation and was supported in part by the National Science Foundation, DARPA, and the Semiconductor Research Corporation.
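As described by MIT News, Sandook pairs a global scheduler making cluster-wide balancing decisions with local schedulers that react quickly on each machine. The sketch below is a simplified, hypothetical rendering of that two-tier split, not Sandook's actual algorithm: local logic routes I/O away from a congested drive immediately, while the global tier steers new work toward the least-loaded machine. Queue depth is used here as a stand-in congestion signal.

```python
# Simplified, hypothetical sketch of a two-tier storage scheduler in the
# spirit of Sandook (NOT the published algorithm). Local schedulers react
# instantly to a slow SSD; a global scheduler balances across machines.

class LocalScheduler:
    def __init__(self, ssd_queue_depths):
        # Queue depth per SSD on this machine (proxy for congestion).
        self.queues = ssd_queue_depths

    def pick_ssd(self):
        # Fast local decision: route the next I/O to the least-loaded SSD,
        # sidestepping a drive stalled by e.g. garbage collection.
        return min(self.queues, key=self.queues.get)

    def submit(self):
        ssd = self.pick_ssd()
        self.queues[ssd] += 1
        return ssd

class GlobalScheduler:
    def __init__(self, machines):
        self.machines = machines  # machine name -> LocalScheduler

    def least_loaded_machine(self):
        # Slower, cluster-wide decision: steer new work toward the machine
        # with the smallest total outstanding queue depth.
        return min(self.machines,
                   key=lambda m: sum(self.machines[m].queues.values()))

m1 = LocalScheduler({"ssd0": 5, "ssd1": 1})   # ssd0 congested
m2 = LocalScheduler({"ssd0": 2, "ssd1": 2})
cluster = GlobalScheduler({"m1": m1, "m2": m2})

print(cluster.least_loaded_machine())  # m2 (total depth 4 vs 6)
print(m1.submit())                     # ssd1, avoiding congested ssd0
```

The design intuition this captures: congestion events like garbage collection happen too fast for a central controller to catch, so the local tier handles them, while the global tier corrects the slower, cluster-wide imbalances no single machine can see.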

Source links
https://news.mit.edu/2026/helping-data-centers-deliver-higher-performance-less-hardware-0407
https://goharirfan.me/publications/sandook_nsdi_2026.pdf

These stories land in different parts of the AI stack, but they point in the same direction. The next bottlenecks are no longer just model quality—they are whether we can measure change fast enough and operate the underlying systems efficiently enough to keep up.
