Famous quotes
"Happiness can be defined, in part at least, as the fruit of the desire and ability to sacrifice what we want now for what we want eventually" - Stephen Covey
Sunday, December 29, 2024
Friday, December 27, 2024
Energy diversification
Article in The Hindu BusinessLine
**Energy Diversification in India's Oil Import Strategy**
India’s energy strategy is witnessing a significant shift as it diversifies its oil import basket to reduce dependence on traditional suppliers from the Middle East. This change is driven by the country’s burgeoning energy demands and the need for long-term energy security.
### **Key Trends and Developments**
1. **Shift in Supplier Share**:
The share of Middle Eastern crude oil in India’s import basket is expected to decline. Russian crude has emerged as a major player, accounting for more than 40% of India’s total imports in 2024, with 1.7 million barrels per day (b/d) from January to September.
- Iraq: Second-largest supplier with 940,000 b/d.
- Saudi Arabia: Third-largest with 623,000 b/d.
- The US and UAE also remain key suppliers.
2. **Focus on New Suppliers**:
- India is exploring long-term contracts with emerging oil producers like Guyana, spurred by diplomatic visits and strategic discussions.
- Guyana is being eyed as a potential major supplier, marking India’s foray into South American crude markets.
3. **Increased Refining Capacity**:
- India’s refining capacity is set to rise to meet growing demand. Policymakers are intensifying efforts to diversify crude sources, aiming to balance cost and security.
4. **Energy Security through Diversification**:
- The strategy includes reducing over-reliance on Middle Eastern suppliers and leveraging cheaper alternatives, like Russian crude, offered at discounts.
### **Demand Dynamics**
India is one of the fastest-growing oil consumption centers, surpassing China’s growth rate in 2024:
- Oil demand rose by 3.2% year-on-year, compared to China’s 1.7%.
- Petrochemical feedstock requirements are driving this surge, alongside increased industrial and transportation needs.
### **Strategic Implications**
- India’s diversification aligns with its aim to safeguard energy security amid geopolitical shifts and market volatility.
- The approach allows India to tap into more competitive pricing and secure supplies from a broader range of countries, reducing risks associated with regional instability.
### **Looking Ahead**
India’s energy diversification strategy is poised to reshape its import dynamics, emphasizing sustainability, cost-efficiency, and long-term partnerships with emerging oil-exporting nations.
Manmohan Singh's Legacy, as per ChatGPT
Manmohan Singh’s legacy is multifaceted, reflecting his roles as an economist, policymaker, and leader who steered India through transformative economic and political phases. Here's a deeper analysis of his contributions:
---
### **Economic Reforms and Liberalization (1991-1996)**
#### **Background**
When Singh became Finance Minister in 1991, India was on the brink of an economic collapse. The Gulf War had worsened the balance of payments crisis, foreign reserves were nearly depleted, and India faced the prospect of defaulting on its loans.
#### **Actions Taken**
1. **Liberalization of the Economy**:
Singh, in collaboration with Prime Minister PV Narasimha Rao, dismantled the License Raj, which had stifled entrepreneurship and innovation for decades. This reform removed bureaucratic hurdles for businesses, encouraging private sector growth.
2. **Trade and Investment Reforms**:
Tariffs and quotas were significantly reduced, allowing greater participation in global markets. Foreign Direct Investment (FDI) was encouraged in critical sectors, fostering economic integration with the global economy.
3. **Financial Sector Modernization**:
Singh implemented critical changes, including the establishment of private sector banks and financial institutions. These reforms led to a more robust banking system and laid the groundwork for India’s capital markets.
4. **Currency Devaluation**:
Singh devalued the Indian rupee to make exports competitive and stabilize the economy. Though politically contentious, this move restored investor confidence.
#### **Impact**
These reforms set the stage for India’s economic resurgence. GDP growth, which had stagnated in the late 1980s, picked up, averaging around 5.1% during his tenure. Industries like IT, pharmaceuticals, and automobiles emerged as global leaders.
---
### **Leadership as Prime Minister (2004-2014)**
#### **First Term (2004-2009)**
Singh’s first term as Prime Minister was marked by bold policy initiatives aimed at inclusive growth:
1. **Mahatma Gandhi National Rural Employment Guarantee Act (MGNREGA)**: Aimed at providing employment to rural workers, MGNREGA became one of the largest social welfare schemes globally, lifting millions out of poverty.
2. **Right to Information Act**: This law empowered citizens by providing greater transparency in governance and reducing corruption.
3. **Right to Education Act**: Singh’s government made elementary education a fundamental right, ensuring free and compulsory education for children aged 6-14.
4. **India-US Civil Nuclear Agreement**: This landmark deal ended India’s nuclear isolation, allowing access to international nuclear technology and fuel while maintaining its strategic autonomy.
#### **Second Term (2009-2014)**
While Singh’s second term began with promise, it faced significant challenges:
1. **Economic Slowdown**:
Global financial crises and domestic policy inertia led to a decline in economic growth rates.
2. **Corruption Allegations**: Scandals, such as the 2G spectrum and coal block allocation cases, marred his government’s reputation, although Singh himself remained personally untarnished.
3. **Political Challenges**: Coalition politics and pressure from alliance partners undermined the government’s decision-making capacity, slowing the pace of reforms.
---
### **Leadership Traits and Challenges**
1. **Integrity and Dignity**:
Throughout his career, Singh was admired for his incorruptibility and ethical conduct. His reputation as a “gentleman-politician” set him apart in the often-contentious world of Indian politics.
2. **Steadfast Commitment to Policy**: Singh prioritized evidence-based policymaking and sought expert opinions before implementing reforms, valuing consultation over populism.
3. **Lack of Political Aggression**: Critics argue that Singh’s reserved and non-confrontational style limited his ability to assert authority, especially during his tenure as Prime Minister.
---
### **Long-Term Impact on India**
1. **Economic Transformation**:
Singh’s 1991 reforms set India on a path of sustained economic growth, transforming it into one of the world’s fastest-growing economies. His emphasis on liberalization continues to shape policy directions.
2. **Social Equity**:
Programs like MGNREGA and the Right to Education highlighted Singh’s commitment to bridging economic and social disparities.
3. **Global Standing**:
Singh’s tenure elevated India’s global stature. His diplomatic engagements, especially the Indo-US nuclear deal, showcased his ability to navigate complex geopolitical landscapes.
---
### **Legacy**
Manmohan Singh’s legacy is that of a reluctant yet transformative leader who combined intellectual depth with pragmatic action. While his tenure faced criticism, especially during his second term, his contributions to India’s economic and social fabric remain unparalleled. His life exemplifies the power of quiet leadership grounded in values, intellect, and an unwavering commitment to national progress.
---
Thursday, December 26, 2024
No, NBA teams are not all playing exactly the same way
And if you think they are, you're not paying attention to the right things
Jared Dubin
Everywhere you look these days, you can hear someone making some form of the same argument: “Every NBA team plays the same way.” It’s everywhere. It’s on TV. It’s on Twitter. It’s made its way over to Bluesky. It’s probably in your group texts and your DMs and your G-chats, if you’re still using that.
And I’m sorry, but it is just flat-out not true on any level. Like, not even close.
In the sense that somewhere between most and all teams shoot a lot of threes, there is a degree of similarity. But there’s a pretty big gap between the team that shoots the most threes (55.7% of Boston's shots come from beyond the arc) and the team that shoots the least (just 34.1% of Denver's shots are treys). And there are much wider disparities in the ways teams go about actually generating their shots — even among those who get similar types of looks.
Chicago takes the second-highest share of its shots from three, for example, but runs around twice as many hand-offs per 100 possessions as does Boston. In turn, the Bulls run both isolation plays and off-ball screens significantly less often than do the Celtics. Despite those differences, they’re pretty similar in terms of their shot profile, and that's all that people tend to focus on.
Take a look at each team’s play type distribution, and you can spot some pretty massive differences in the way they want to attack their opponents.
The Grizzlies, as Ben Taylor explored at Thinking Basketball, essentially never run pick and rolls — at least in comparison with the average NBA team. Their 33.6 ball screens per 100 possessions is not even in the same universe as the Suns' league-leading 83.8 picks per 100 mark. Yes, the Suns really are running more than twice as many pick and rolls per 100 possessions as the Grizz. And they’re joined by 11 other teams that are also more than doubling up Memphis in the P&R department.
I’m not sure how someone could see that, whether on their TV or computer screen or in the (advanced) stat sheet, and conclude that they’re all playing the exact same brand of basketball.
Similarly, you can’t look at the Jazz running nearly 70 off-ball screens per 100 possessions, then check out their former coach Quin Snyder’s new team in Atlanta — which runs just 29.3 of those per 100 — and not plainly see the massive difference in the ways those teams want to accomplish the same goal (scoring points), just based on who is coaching them.
Then you've got the Nets using a metric ton of dribble handoffs (48.8 per 100 possessions) to manufacture space for their players because they don't have an elite individual creator, while, as mentioned, the Celtics barely need to bother with DHO action (9.3 hand-offs per 100) because they have Jayson Tatum and Jaylen Brown and Kristaps Porzingis and more. Boston just lets those guys isolate to their hearts’ content (27.1 per 100 possessions); but the Warriors (13.3 isos per 100) almost never do the same, instead running their motion offense that we’ve grown so familiar with over the years.
Of course, that’s just play types. They tell us what actions teams are running, but not necessarily how they go about their business offensively. To measure that, I borrowed a concept from friend of the blog Ian Levy, who back in the day created offensive style charts using the public player tracking data. I did the same analysis as Ian, using the following data points for each NBA team:
Pace: Seconds per offensive possession
Shot Selection: Moreyball Rate
Ball Movement: Seconds per touch on offense
Player Movement: Average feet traveled per 24 seconds on offense
I then converted each of those measurements into percentiles so they could be scaled together on radar charts, mostly because I like the way those charts look. And in those, too, you can see that there are so many teams playing basketball so differently from each other.
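The percentile conversion described above is simple to sketch. Here is a minimal illustration; the team abbreviations and pace values are invented for the example, not taken from the article's data:

```python
# Hypothetical seconds-per-possession values for a few teams
pace = {"CLE": 13.8, "BOS": 14.6, "NYK": 15.1, "MEM": 12.9}

def percentile_ranks(metric):
    """Map each team's raw value to a 0-100 percentile among all teams,
    so different metrics can share one scale on a radar chart."""
    vals = list(metric.values())
    n = len(vals)
    # percentile = share of other teams strictly below this team's value
    return {team: 100.0 * sum(v < x for v in vals) / (n - 1)
            for team, x in metric.items()}

print(percentile_ranks(pace))
```

Repeating this for each of the four measurements gives every team four percentiles, one per radar-chart axis.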
Check out the league’s three best offenses, for example. Cleveland is going about things in a way that is not remotely similar to either New York or Boston, which play somewhat similarly but also diverge in how quickly they seek shots and how much their players move around on each possession.
You can spot the same kinds of disparities among the NBA’s three worst offenses. New Orleans, Portland, and Washington all use vastly different styles of play to achieve their terrible results. There are a whole bunch of different ways to score inefficiently — which you can even do with a healthy shot distribution like the one the Blazers have.
How about some of the best individual offensive players in the league? Denver plays much differently than Milwaukee, which in turn plays much differently than Oklahoma City. And that makes intuitive sense, because Nikola Jokic, Giannis Antetokounmpo, and Shai Gilgeous-Alexander are not remotely similar players. Their teams play the style of basketball that best fits their skill sets, and they all still find their way to score at a top-10 rate. Similarly, I’ve written before about how, at least in terms of the ways they create offense for themselves and others, Luka Doncic and LeBron James might be the closest thing to each other in the league. And yet, their teams appear to play much differently, and they each are significantly different than, say, the Sixers. And the Sixers would likely be even more different than them both if their star players had been remotely healthy this season.
We can also look at some of the younger teams with highly unusual offenses. Memphis, as mentioned, runs a much different offense than pretty much everyone else in the NBA. But its way of going about things from an offensive theory standpoint is also much different from teams like Orlando and Utah, which are also not similar to each other at all.
And honestly, if you’re paying attention, it’s not even all that hard to spot these differences in style of play. If you’re willing to look at anything beyond “everybody shoots threes,” that is.
There are plenty of legitimate critiques of today’s NBA. The idea that everyone in the league is playing the same exact way just isn’t one of them.
Wednesday, December 25, 2024
Sunday, December 22, 2024
Sumner's Central Claims: ChatGPT
Overview: Sumner’s Central Claims
Macroeconomics Should Resemble Finance More than Engineering
Sumner contrasts “engineering-style” macro models—those built around mechanical relationships like the Phillips Curve or the Taylor Rule—with a more “finance-style” approach that embraces forward-looking asset prices. Traditional engineering-style models try to pinpoint how specific policy levers (interest rates, fiscal deficits, monetary aggregates) mechanically translate into macro variables like inflation. By contrast, a finance-style perspective acknowledges that markets respond in real time to news about monetary policy’s future stance, making policy expectations central to actual outcomes.
Prediction Is Very Hard, but Ex Post Explanation Is Somewhat Easier
Just as stock prices (Nvidia, Bitcoin) are nearly impossible to forecast accurately, inflation forecasts are famously unreliable. Ex post, however, we can often tell a coherent story: e.g. “AI-fueled demand for Nvidia chips” or “supply chain disruptions + too-easy monetary policy caused inflation.” But because reality is so complex, any “equation-heavy” approach risks failing badly out of sample.
Inflation Dynamics Mostly Reflect Monetary Policy Mistakes
Sumner insists that if a central bank truly wants to keep inflation around 2%, it can do so—on average—over the longer run. Deviations from that 2% path primarily happen because central banks choose (intentionally or accidentally) not to bring inflation back on target. Thus, in his view, “undesirable” demand-driven inflation almost always stems from central bank errors, rather than exogenous shocks or “faults” in capitalism itself.
Level Targeting and “Target the Forecast”
Sumner’s policy advice is to adopt “level targeting” (for prices or NGDP) rather than “growth rate targeting.” Level targeting requires making up for past misses: if inflation runs above target one year, it must come in below target in subsequent years to bring the price level back to its original path. This design encourages stabilizing speculation in financial markets: once the public knows the central bank will always “undo” overshoots or undershoots, asset prices start moving preemptively to push the economy back onto the desired path.
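The makeup arithmetic behind level targeting is easy to show with a toy two-year example (the 4% overshoot is an invented number, used only to illustrate the mechanism):

```python
# Year 1: inflation overshoots the 2% target, coming in at 4%.
# Compare what each regime demands of year-2 inflation.
TARGET = 0.02

price_after_year1 = 1.00 * 1.04               # price level after the overshoot
desired_path      = 1.00 * (1 + TARGET) ** 2  # the original 2%-growth level path

# Growth-rate targeting "lets bygones be bygones": just aim for 2% again
growth_targeting_level = price_after_year1 * (1 + TARGET)

# Level targeting: choose year-2 inflation so the price level rejoins the path
level_targeting_inflation = desired_path / price_after_year1 - 1

print(f"level-targeting year-2 inflation: {level_targeting_inflation:.4%}")
```

The level-targeting regime demands below-2% inflation in year 2 (roughly zero here), while growth-rate targeting leaves the price level permanently above its original path.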
The Fed’s 2021–22 Mistakes and “Asymmetrical” FAIT
Sumner argues the Federal Reserve’s Flexible Average Inflation Targeting (FAIT), introduced in 2020, turned out to be one-sided in practice. The Fed said it would tolerate higher inflation to make up for previous undershoots, but never committed to tolerating below-2% inflation to correct future overshoots. In his telling, this asymmetry (combined with stimulus in 2021–22) explains why inflation overshot so dramatically. If the Fed had adopted a truly symmetrical level targeting rule, it would have tightened earlier, keeping average inflation around 2% over the 2020–2024 window.
Market Forecasts Are the Least Bad Forecasts
Sumner embraces a kind of (qualified) Efficient Market Hypothesis (EMH), suggesting that the best real-time gauge of future inflation (or NGDP growth) is gleaned from market indicators such as TIPS spreads, forward rates, or CPI futures. A central bank that cares about hitting, say, a 2% path for inflation should “target the forecast” by adjusting policy instruments until market indicators imply 2% inflation going forward.
Key Themes in Sumner’s Argument
1. Engineering vs. Finance Perspectives
Engineering-Style Macro
This approach features mechanical relationships:
Phillips Curve: inflation is higher when unemployment is low. But Sumner points out that a simple Phillips Curve fails to explain why inflation stayed subdued in 2019 even at 3.5% unemployment (or why 1933–34 had rising prices with 25% unemployment).
Taylor Rule: interest rates below some “neutral” level ignite inflation. Fine—but then how do we explain near-zero policy rates in the early and mid-2010s with no inflation surge?
Quantity Theory: M×V = P×Y. Fine for big historical swings (hyperinflations, 1960s–70s, etc.), but poor for short-run year-to-year guidance. Different definitions of “money” often give conflicting signals.
Fiscal Theory: big post-Covid deficits coincided with high inflation; but then why didn’t big deficits in 2015–19 spark inflation? And why did Japan remain deflationary despite high debt?
Finance-Style Macro
Here, the focus is on expectations and market-driven asset prices. Because the public knows the central bank is ultimately in the driver’s seat for nominal variables, inflation and NGDP growth become as much about “credible commitments” as about specific policy levers. Sumner’s signature example is the Hong Kong Monetary Authority’s currency peg. HK does not need to “predict” or “explain” every movement in demand for its currency; it simply adjusts the monetary base to maintain a fixed exchange rate. Something similar, Sumner says, could be done with an inflation forecast peg: if the market expects above-2% inflation, the Fed tightens until that forecast returns to 2%, regardless of interest-rate or money-supply changes.
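The forecast peg described above can be caricatured as a simple feedback loop: adjust the instrument until the market's implied forecast sits on target. The response function and coefficients below are invented purely for illustration; the point is the closed loop, not the numbers:

```python
# Toy "target the forecast" rule. The central bank keeps adjusting its
# instrument until the market's inflation forecast equals the 2% target.
TARGET = 2.0  # percent

def market_forecast(policy_rate):
    # Hypothetical market response: each point of tightening
    # lowers expected inflation by half a percentage point.
    return 4.0 - 0.5 * policy_rate

rate = 0.0
while abs(market_forecast(rate) - TARGET) > 1e-6:
    # Tighten when the forecast is above target, loosen when below
    rate += 0.5 * (market_forecast(rate) - TARGET)

print(round(rate, 2))  # instrument setting at which the forecast hits 2%
```

Like Hong Kong's exchange-rate peg, the rule needs no structural model of why demand moved; it only needs an observable market forecast to steer against.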
2. Ex Post vs. Ex Ante Complexity
Sumner’s emphasis on the difficulty of forecasting stands in sharp contrast to the relative ease of explaining after the fact. He notes that the very same phenomenon arises in asset prices. The conclusion is that macroeconomists—like stock-market forecasters—should be more humble in their predictions, and focus on policy frameworks that mitigate the damage when something inevitably goes wrong.
3. Role of Policy Mistakes
Under Sumner’s view, one cannot fully “explain” inflation dynamics without highlighting that central banks fail to offset demand shocks. If they target inflation (or NGDP) and faithfully “level target,” demand-side inflation shouldn’t stray too far from goal. In real life, however, policymakers deviate from best practices: they might be swayed by politics (“We must create jobs!”), by institutional inertia (“We’ve never tried pegging a forecast.”), or by poor forecasts (“We trust our model’s projection over TIPS spreads.”). These mistakes accumulate, resulting in persistent overshoots or undershoots.
4. Why Level Targeting Stabilizes
Under true “level targeting,” the central bank commits: “If inflation runs above 2% this year, we will shoot for below 2% next year to return to the original price-level path.” Because the market knows the central bank will do so, interest rates and other asset prices react promptly—long before inflation drifts too high. Traders profit by betting on the eventual correction; in so doing, they enforce the correction earlier. In Hong Kong’s example, speculators know the Monetary Authority will step in whenever the HKD strays from the 7.75–7.85 band. This self-fulfilling stabilizing speculation keeps the HKD dollar stable with relatively few “concrete steps.”
By contrast, “let bygones be bygones” inflation targeting (which is standard) only aims to hit 2% this year, ignoring last year’s overshoot. Under an asymmetrical FAIT, policy overshoots remain in the system, leaving more room for cumulative drift.
5. Why Traditional Models (Often) Disappoint
Sumner takes aim at dissertations titled “Money Demand in Turkey” or “Fiscal Multipliers in Country X.” Such projects typically try to find stable coefficients (e.g., the “multiplier” = 1.2, or the elasticity of money demand = –0.9) in a complex system. But once the political regime, the central bank’s reaction function, or the data sample changes, these relationships break down. Out of sample, these engineering models often fail.
6. A Broader Philosophy-of-Economics Angle
Sumner draws an analogy to Richard Rorty’s take on truth and epistemology. Rorty wasn’t denying that there is truth; he was questioning whether a universal theory of truth was a meaningful endeavor. Likewise, Sumner does not deny that inflation has causes; he denies that a simple macro model—stuffed with equations for deficits, interest rates, money supply, etc.—captures the real source of volatility. For him, the root cause is always: the central bank doesn’t do the symmetrical offsetting it claims to do.
Points of Tension and Critiques
Is Everything Really a Policy Mistake?
Many mainstream economists find it reductive to label every inflation swing as a mistake by the Fed. They point to structural or non-monetary factors (energy price shocks, geopolitical events, supply chain meltdowns). Sumner concedes that supply shocks should lead to some flexible deviation in inflation. But in his view, persistent divergences (e.g., 2021–22) reflect excessive nominal spending growth that the Fed could have reined in, had it chosen a stricter offsetting rule.
Is Financial-Market Forecasting Always “Least Bad”?
Sumner champions the EMH, but critics would note that markets can also overreact or exhibit herding. The puzzle is whether TIPS spreads or other market-based inflation indicators consistently offer better signals than sophisticated in-house models at the Federal Reserve. Sumner’s stance: “It’s not that markets are perfect, but compared to who?” Communication and Political Constraints
Even if one granted that “pegging the inflation (or NGDP) forecast” is the optimal rule, it might be politically (or institutionally) hard for the Fed to break its long-standing practice of using interest rates as the main “concrete step.” Changing to “We are now pegging the real-time inflation forecast” would be a huge conceptual leap for policymakers, journalists, and the broader public.
Excessive Focus on the Nominal at the Expense of the Real?
Tyler Cowen’s critique (as Sumner mentions) sometimes suggests that Sumner’s focus on “the nominal” misses the deep structural forces in the economy—technology, demography, global supply chains. Sumner’s response: “Nominal instability causes cyclical problems. If you want to talk about ‘real’ structural issues, that’s fine, but they don’t directly drive inflation. A central bank can always offset those to keep nominal demand stable.”
Why Sumner’s Perspective Appeals to Some
Logical Consistency + Simple Core Once you accept that the central bank (a) has the last word on nominal variables over the long run and (b) can neutralize demand shocks if it chooses to, it follows that persistent overshoots are policy mistakes. This framework also elegantly explains why certain big fiscal expansions do not cause inflation (the Fed offset them) while other expansions do cause inflation (the Fed tolerated them).
Ex Post Track Record
Sumner points to examples like 2012–13, when many Keynesians predicted looming recession from “fiscal austerity.” Financial markets were not pricing in gloom, and Sumner argued the Fed would offset austerity. Indeed, no double dip ensued and growth actually picked up. He admits there is luck involved, but the underlying logic—“The Fed offsets demand shocks”—still resonates with younger economists who see too many “missed predictions” from conventional models.
Market Monetarism’s Emphasis on NGDP Much of Sumner’s alternative approach is spelled out in his work on “market monetarism,” which contends that stabilizing nominal GDP along a level path is usually superior to pure inflation targeting. This stems from the notion that stable nominal spending helps avoid both deflationary slumps and runaway inflation, mitigating many real-world business-cycle pathologies.
Concluding Thoughts
Sumner’s post aims to clarify why his “market-driven” approach to inflation—and to macro more generally—differs from standard Phillips Curve or Taylor Principle narratives. He does not claim to offer a formula to predict inflation. Instead, he claims:
Excess inflation comes from the central bank choosing (or erring) not to rein in nominal spending growth. We can better stabilize inflation (and output) if we adopt level targeting and allow real-time market forecasts to guide policy. Macroeconomic forecasting is inherently unreliable, so we should focus on robust policy rules that reduce the cost of being wrong. Ultimately, Sumner wants to push macro toward what he calls a “finance perspective”: target your policy goal the way a currency board pegs an exchange rate, harnessing stabilizing speculation in the process. His view is that the “engineering” perspective—stuffing hundreds of ad hoc relationships into a big model—over-promises and under-delivers when it comes to explaining (and especially predicting) real-world inflation volatility.
Where many see an unsolved puzzle (“Why did inflation jump here but not there?”), Sumner sees repeated central bank “own goals.” He believes that any workable “model of inflation dynamics” must start with the premise: “Central bankers deviate from a symmetrical level target and fail to offset demand shocks.” Everything else is minor detail.
Whether one finds this framework too simplistic or refreshingly parsimonious, Sumner’s arguments have influenced a generation of “market monetarists.” While mainstream economists still focus on standard interest-rate policy and Phillips-curve-style analysis, Sumner continues to press for a fundamental overhaul—one that, he hopes, might one day make macro as “boring and predictable” as a currency peg or a strict nominal target.
Macroeconomics Should Resemble Finance More than Engineering
Sumner contrasts “engineering-style” macro models—those built around mechanical relationships like the Phillips Curve or the Taylor Rule—with a more “finance-style” approach that embraces forward-looking asset prices. Traditional engineering-style models try to pinpoint how specific policy levers (interest rates, fiscal deficits, monetary aggregates) mechanically translate into macro variables like inflation. By contrast, a finance-style perspective acknowledges that markets respond in real time to news about monetary policy’s future stance, making policy expectations central to actual outcomes.
Prediction Is Very Hard, but Ex Post Explanation Is Somewhat Easier
Just as stock prices (Nvidia, Bitcoin) are nearly impossible to forecast accurately, inflation forecasts are famously unreliable. Ex post, however, we can often tell a coherent story: e.g. “AI-fueled demand for Nvidia chips” or “supply chain disruptions + too-easy monetary policy caused inflation.” But because reality is so complex, any “equation-heavy” approach risks failing badly out of sample.
Inflation Dynamics Mostly Reflect Monetary Policy Mistakes
Sumner insists that if a central bank truly wants to keep inflation around 2%, it can do so—on average—over the longer run. Deviations from that 2% path primarily happen because central banks choose (intentionally or accidentally) not to bring inflation back on target. Thus, in his view, “undesirable” demand-driven inflation almost always stems from central bank errors, rather than exogenous shocks or “faults” in capitalism itself.
Level Targeting and “Target the Forecast”
Sumner’s policy advice is to adopt “level targeting” (for prices or NGDP) rather than “growth rate targeting.” Level targeting requires making up for past misses: if inflation runs above target one year, it must come in below target in subsequent years to bring the price level back to its original path. This design encourages stabilizing speculation in financial markets: once the public knows the central bank will always “undo” overshoots or undershoots, asset prices start moving preemptively to push the economy back onto the desired path.
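The make-up arithmetic behind level targeting can be sketched in a few lines. This is my own back-of-the-envelope illustration with hypothetical numbers, not anything from Sumner's post:

```python
# Level targeting: after a miss, next year's inflation goal must
# return the price level to its original 2%-per-year path.
# Hypothetical numbers, for illustration only.

TARGET = 0.02  # 2% price-level path

def makeup_rate(actual_last_year, target=TARGET):
    """Inflation rate needed next year so that two years of actual
    inflation compound to the same price level as two years on target."""
    return (1 + target) ** 2 / (1 + actual_last_year) - 1

# Inflation ran 5% against the 2% path; the bank must now aim below 2%:
goal = makeup_rate(0.05)
print(f"{goal:.2%}")  # roughly -0.91%, i.e. mild deflation next year
```

Under growth-rate targeting, by contrast, the goal would simply reset to 2% and the three-point overshoot would stay in the price level permanently.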
The Fed’s 2021–22 Mistakes and “Asymmetrical” FAIT
Sumner argues the Federal Reserve’s Flexible Average Inflation Targeting (FAIT), introduced in 2020, turned out to be one-sided in practice. The Fed said it would tolerate higher inflation to make up for previous undershoots, but never committed to tolerating below-2% inflation to correct future overshoots. In his telling, this asymmetry (combined with stimulus in 2021–22) explains why inflation overshot so dramatically. If the Fed had adopted a truly symmetrical level targeting rule, it would have tightened earlier, keeping average inflation around 2% over the 2020–2024 window.
Market Forecasts Are the Least Bad Forecasts
Sumner embraces a kind of (qualified) Efficient Market Hypothesis (EMH), suggesting that the best real-time gauge of future inflation (or NGDP growth) is gleaned from market indicators such as TIPS spreads, forward rates, or CPI futures. A central bank that cares about hitting, say, a 2% path for inflation should “target the forecast” by adjusting policy instruments until market indicators imply 2% inflation going forward.
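"Target the forecast" can be read as a simple feedback loop: move the instrument until the market-implied forecast sits at the target. The sketch below is stylized and entirely my own; in particular, `market_forecast` is an invented stand-in for reading TIPS spreads or CPI futures:

```python
# "Target the forecast" as a feedback loop: adjust the policy
# instrument until the market-implied inflation forecast equals 2%.
# market_forecast() is a made-up linear stand-in for market data.

TARGET = 0.02

def market_forecast(policy_rate):
    # Hypothetical response: tighter policy lowers expected inflation.
    # Real markets are not this tidy.
    return 0.05 - 0.5 * policy_rate

rate = 0.0
for _ in range(200):
    gap = market_forecast(rate) - TARGET   # forecast minus target
    if abs(gap) < 1e-9:
        break
    rate += gap                            # tighten while forecast is high

print(f"instrument settles at {rate:.2%}")  # about 6.00%
```

The point of the exercise is that the rule needs no structural model of where inflation comes from; it only needs a reliable real-time forecast to push against.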
Key Themes in Sumner’s Argument
1. Engineering vs. Finance Perspectives
Engineering-Style Macro
This approach features mechanical relationships:
Phillips Curve: inflation is higher when unemployment is low. But Sumner points out that a simple Phillips Curve fails to explain why inflation stayed subdued in 2019 even at 3.5% unemployment (or why 1933–34 had rising prices with 25% unemployment).
Taylor Rule: interest rates below some “neutral” level ignite inflation. Fine—but then how do we explain near-zero policy rates in the early and mid-2010s with no inflation surge?
Quantity Theory: M×V = P×Y. Fine for big historical swings (hyperinflations, 1960s–70s, etc.), but poor for short-run year-to-year guidance. Different definitions of “money” often give conflicting signals.
Fiscal Theory: big post-Covid deficits coincided with high inflation; but then why didn’t big deficits in 2015–19 spark inflation? And why did Japan remain deflationary despite high debt?
Finance-Style Macro
Here, the focus is on expectations and market-driven asset prices. Because the public knows the central bank is ultimately in the driver’s seat for nominal variables, inflation and NGDP growth become as much about “credible commitments” as about specific policy levers. Sumner’s signature example is the Hong Kong Monetary Authority’s currency peg. HK does not need to “predict” or “explain” every movement in demand for its currency; it simply adjusts the monetary base to maintain a fixed exchange rate. Something similar, Sumner says, could be done with an inflation forecast peg: if the market expects above-2% inflation, the Fed tightens until that forecast returns to 2%, regardless of interest-rate or money-supply changes.
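The Quantity Theory identity in the list above is easiest to see in growth-rate form: inflation ≈ money growth + velocity growth − real output growth. A toy calculation (the figures are invented) shows how two definitions of "money" can point in different directions, which is exactly the short-run ambiguity Sumner flags:

```python
# Quantity Theory in growth-rate form:
#   inflation ≈ money growth + velocity growth - real output growth
# The figures below are invented, purely to illustrate how different
# monetary aggregates can send conflicting signals.

def implied_inflation(money_g, velocity_g, output_g):
    return money_g + velocity_g - output_g

output_g = 0.02                                    # 2% real growth
narrow = implied_inflation(0.10, -0.05, output_g)  # narrow aggregate: 3%
broad  = implied_inflation(0.04,  0.00, output_g)  # broad aggregate: 2%

print(f"narrow-money signal: {narrow:.0%}, broad-money signal: {broad:.0%}")
```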
2. Ex Post vs. Ex Ante Complexity
Sumner’s emphasis on the difficulty of forecasting stands in sharp contrast to the relative ease of explaining after the fact. He notes that the very same phenomenon arises in asset prices. The conclusion is that macroeconomists—like stock-market forecasters—should be more humble in their predictions, and focus on policy frameworks that mitigate the damage when something inevitably goes wrong.
3. Role of Policy Mistakes
Under Sumner’s view, one cannot fully “explain” inflation dynamics without highlighting that central banks fail to offset demand shocks. If they target inflation (or NGDP) and faithfully “level target,” demand-side inflation shouldn’t stray too far from goal. In real life, however, policymakers deviate from best practices: they might be swayed by politics (“We must create jobs!”), by institutional inertia (“We’ve never tried pegging a forecast.”), or by poor forecasts (“We trust our model’s projection over TIPS spreads.”). These mistakes accumulate, resulting in persistent overshoots or undershoots.
4. Why Level Targeting Stabilizes
Under true “level targeting,” the central bank commits: “If inflation runs above 2% this year, we will shoot for below 2% next year to return to the original price-level path.” Because the market knows the central bank will do so, interest rates and other asset prices react promptly—long before inflation drifts too high. Traders profit by betting on the eventual correction; in so doing, they enforce the correction earlier. In Hong Kong’s example, speculators know the Monetary Authority will step in whenever the HKD strays from the 7.75–7.85 band. This self-fulfilling stabilizing speculation keeps the HKD stable with relatively few “concrete steps.”
By contrast, “let bygones be bygones” inflation targeting (which is standard) only aims to hit 2% this year, ignoring last year’s overshoot. Under an asymmetrical FAIT, policy overshoots remain in the system, leaving more room for cumulative drift.
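The cumulative-drift point can be seen in a toy simulation, entirely my own illustration: both regimes face identical demand shocks, but only the level targeter re-aims at the original path, so its price level cannot wander.

```python
# Toy simulation: "let bygones be bygones" vs. level targeting.
# Each year a demand shock perturbs inflation. The bygones bank
# re-targets 2% and keeps every miss; the level targeter aims at
# whatever rate returns the price level to the original path.

import random

random.seed(42)
TARGET = 0.02

path = bygones = level = 1.0
for year in range(30):
    path *= 1 + TARGET
    shock = random.gauss(0, 0.01)     # demand shock the bank misses
    bygones *= 1 + TARGET + shock     # the miss becomes permanent
    goal = path / level - 1           # make-up goal back to the path
    level *= 1 + goal + shock         # same shock, re-anchored target

# `level` ends within one shock of `path`; `bygones` carries all 30 misses.
print(f"path {path:.3f}  level-targeter {level:.3f}  bygones {bygones:.3f}")
```

The design choice to express the level targeter's goal as `path / level - 1` is what makes each year's error transitory: next year's target absorbs it.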
5. Why Traditional Models (Often) Disappoint
Sumner takes aim at dissertations titled “Money Demand in Turkey” or “Fiscal Multipliers in Country X.” Such projects typically try to find stable coefficients (e.g., the “multiplier” = 1.2, or the elasticity of money demand = –0.9) in a complex system. But once the political regime, the central bank’s reaction function, or the data sample changes, these relationships break down. Out of sample, these engineering models often fail.
6. A Broader Philosophy-of-Economics Angle
Sumner draws an analogy to Richard Rorty’s take on truth and epistemology. Rorty wasn’t denying that there is truth; he was questioning whether a universal theory of truth was a meaningful endeavor. Likewise, Sumner does not deny that inflation has causes; he denies that a simple macro model—stuffed with equations for deficits, interest rates, money supply, etc.—captures the real source of volatility. For him, the root cause is always: the central bank doesn’t do the symmetrical offsetting it claims to do.
Points of Tension and Critiques
Is Everything Really a Policy Mistake?
Many mainstream economists find it reductive to label every inflation swing as a mistake by the Fed. They point to structural or non-monetary factors (energy price shocks, geopolitical events, supply chain meltdowns). Sumner concedes that supply shocks should lead to some flexible deviation in inflation. But in his view, persistent divergences (e.g., 2021–22) reflect excessive nominal spending growth that the Fed could have reined in, had it chosen a stricter offsetting rule.
Is Financial-Market Forecasting Always “Least Bad”?
Sumner champions the EMH, but critics would note that markets can also overreact or exhibit herding. The puzzle is whether TIPS spreads or other market-based inflation indicators consistently offer better signals than sophisticated in-house models at the Federal Reserve. Sumner’s stance: “It’s not that markets are perfect, but compared to who?”
Communication and Political Constraints
Even if one granted that “pegging the inflation (or NGDP) forecast” is the optimal rule, it might be politically (or institutionally) hard for the Fed to break its long-standing practice of using interest rates as the main “concrete step.” Changing to “We are now pegging the real-time inflation forecast” would be a huge conceptual leap for policymakers, journalists, and the broader public.
Excessive Focus on the Nominal at the Expense of the Real?
Tyler Cowen’s critique (as Sumner mentions) sometimes suggests that Sumner’s focus on “the nominal” misses the deep structural forces in the economy—technology, demography, global supply chains. Sumner’s response: “Nominal instability causes cyclical problems. If you want to talk about ‘real’ structural issues, that’s fine, but they don’t directly drive inflation. A central bank can always offset those to keep nominal demand stable.”
Why Sumner’s Perspective Appeals to Some
Logical Consistency + Simple Core
Once you accept that the central bank (a) has the last word on nominal variables over the long run and (b) can neutralize demand shocks if it chooses to, it follows that persistent overshoots are policy mistakes. This framework also elegantly explains why certain big fiscal expansions do not cause inflation (the Fed offset them) while other expansions do cause inflation (the Fed tolerated them).
Ex Post Track Record
Sumner points to examples like 2012–13, when many Keynesians predicted looming recession from “fiscal austerity.” Financial markets were not pricing in gloom, and Sumner argued the Fed would offset austerity. Indeed, no double dip ensued and growth actually picked up. He admits there is luck involved, but the underlying logic—“The Fed offsets demand shocks”—still resonates with younger economists who see too many “missed predictions” from conventional models.
Market Monetarism’s Emphasis on NGDP
Much of Sumner’s alternative approach is spelled out in his work on “market monetarism,” which contends that stabilizing nominal GDP along a level path is usually superior to pure inflation targeting. This stems from the notion that stable nominal spending helps avoid both deflationary slumps and runaway inflation, mitigating many real-world business-cycle pathologies.
Concluding Thoughts
Sumner’s post aims to clarify why his “market-driven” approach to inflation—and to macro more generally—differs from standard Phillips Curve or Taylor Principle narratives. He does not claim to offer a formula to predict inflation. Instead, he claims:
1. Excess inflation comes from the central bank choosing (or erring) not to rein in nominal spending growth.
2. We can better stabilize inflation (and output) if we adopt level targeting and allow real-time market forecasts to guide policy.
3. Macroeconomic forecasting is inherently unreliable, so we should focus on robust policy rules that reduce the cost of being wrong.
Ultimately, Sumner wants to push macro toward what he calls a “finance perspective”: target your policy goal the way a currency board pegs an exchange rate, harnessing stabilizing speculation in the process. His view is that the “engineering” perspective—stuffing hundreds of ad hoc relationships into a big model—over-promises and under-delivers when it comes to explaining (and especially predicting) real-world inflation volatility.
Where many see an unsolved puzzle (“Why did inflation jump here but not there?”), Sumner sees repeated central bank “own goals.” He believes that any workable “model of inflation dynamics” must start with the premise: “Central bankers deviate from a symmetrical level target and fail to offset demand shocks.” Everything else is minor detail.
Whether one finds this framework too simplistic or refreshingly parsimonious, Sumner’s arguments have influenced a generation of “market monetarists.” While mainstream economists still focus on standard interest-rate policy and Phillips-curve-style analysis, Sumner continues to press for a fundamental overhaul—one that, he hopes, might one day make macro as “boring and predictable” as a currency peg or a strict nominal target.
Saturday, December 21, 2024
Multiple Drone Incursions Confirmed over Marine Corps Base Camp Pendleton
Marine Corps Base Camp Pendleton in southern California experienced multiple drone incursions over its airspace in recent days, a facility spokesman told The War Zone on Tuesday morning.
Between Dec. 9 and 15, “there were six instances of unmanned aerial systems (UAS) observed entering Camp Pendleton’s airspace, with no threat to installation operations,” Capt. James C. Sartain, a base spokesman, told The War Zone in response to our query on the matter.
Sartain could not immediately provide details about how many drones flew over the installation, their origin, what actions were taken in response, or whether any air or ground operations were affected as a result of the incursions. We have asked for these details and will update this story should any be provided.
Located in north San Diego County, MCB Camp Pendleton is the Marine Corps’ major west coast training facility. It is home to the 1st Marine Expeditionary Force, 1st Marine Division, 1st Marine Logistics Group, elements of the 3rd Marine Aircraft Wing, and several other tenant units including Marine Corps Air Station Camp Pendleton, ACU-5, Naval Hospital, Marine Corps Tactical Systems Support Activity (MCTSSA), Weapons Field Training Battalion, Naval Weapons Station Fallbrook, and Deployment Processing Command/ Reserve Support Unit – West.
This is just the latest in a growing string of incursions reported over U.S. military installations at home and abroad.
News about MCB Camp Pendleton follows a drone incursion over Wright-Patterson Air Force Base in Ohio that forced the closure of its airspace Friday night into Saturday morning. The War Zone was the first to report that incident. On Tuesday, Wright-Patterson officials announced that there were additional incursions over the facility.
“Small unmanned aerial systems were spotted in the vicinity of and over Wright-Patterson AFB’s Area A and Area B during the late evening and early morning of Dec. 16 and Dec. 17th,” the 88th Air Base Wing said in a statement. “Installation leaders have determined that none of the incursions impacted base residents, facilities or assets. The Wright-Patterson AFB airspace was not affected by the incursions.”
“The number of systems has fluctuated, and they have ranged in sizes and configurations,” the release added. “Our units continue to monitor the airspace and are working with local law enforcement authorities and mission partners to ensure the safety of base personnel, facilities and assets. We request individuals in the area to contact either local police or Security Forces if they see anything suspicious, to include sUAS’s or drone activity.”
This all comes amid a growing frenzy about drones that began when they appeared over Picatinny Arsenal in New Jersey on Nov. 18, which we were the first to report. Since then, the FBI said more than 5,000 reports of drone sightings came into its drone hotline, of which fewer than 100 merited further investigation.
On Monday night, the Pentagon, FBI and Department of Homeland Security issued a statement that they have “not identified anything anomalous and do not assess the activity to date to present a national security or public safety risk over the civilian airspace in New Jersey or other states in the northeast.”
The full statement is below:
“There are more than one million drones lawfully registered with the FAA in the United States and there are thousands of commercial, hobbyist and law enforcement drones lawfully in the sky on any given day. With the technology landscape evolving, we expect that number to increase over time.
FBI has received tips of more than 5,000 reported drone sightings in the last few weeks with approximately 100 leads generated, and the federal government is supporting state and local officials in investigating these reports. Consistent with each of our unique missions and authorities, we are quickly working to prioritize and follow these leads. We have sent advanced detection technology to the region. And we have sent trained visual observers.
Having closely examined the technical data and tips from concerned citizens, we assess that the sightings to date include a combination of lawful commercial drones, hobbyist drones, and law enforcement drones, as well as manned fixed-wing aircraft, helicopters, and stars mistakenly reported as drones. We have not identified anything anomalous and do not assess the activity to date to present a national security or public safety risk over the civilian airspace in New Jersey or other states in the northeast.
That said, we recognize the concern among many communities. We continue to support state and local authorities with advanced detection technology and support of law enforcement. We urge Congress to enact counter-UAS legislation when it reconvenes that would extend and expand existing counter-drone authorities to identify and mitigate any threat that may emerge.
Additionally, there have been a limited number of visual sightings of drones over military facilities in New Jersey and elsewhere, including within restricted air space. Such sightings near or over DoD installations are not new. DoD takes unauthorized access over its airspace seriously and coordinates closely with federal, state, and local law enforcement authorities, as appropriate. Local commanders are actively engaged to ensure there are appropriate detection and mitigation measures in place.”
However, public furor has become so concerning that the FBI and New Jersey State Police last night issued a plea for people to not fire lasers or bullets at anything in the sky.
“We are seeing an increase of pilots of manned aircraft being hit in the eyes with lasers as people on the ground think they see a drone,” cautioned Nelson Delgado, Acting Special Agent in Charge of the FBI Newark Field Office, which is leading the drone investigation. “We are also concerned that people will take matters into their own hands and fire a weapon at an aircraft. Not only is this act against the law, but it poses an incredible danger to the pilots and passengers on those aircraft.”
“Whatever your beliefs are,” Delgado added, “putting someone else’s life in danger is not the answer.”
The FBI warning came after pilots of 15 fixed and rotary wing aircraft from Joint Base McGuire-Dix-Lakehurst reported being struck by lasers from the ground since Dec. 7, Capt. Kitsana R. Douglomachan told The War Zone. One of those pilots had to seek medical treatment but was quickly released. All aircraft landed safely, he said, adding that officials do not know who fired the lasers. NJ.com was the first to report these incidents.
Monday night, Hill Air Force Base became the latest installation experiencing drone incursions.
“We can confirm that unmanned aerial systems were spotted in the vicinity of Hill AFB recently,” a spokesperson told KUTV–2 news Monday night. “To date, unmanned aerial systems have not impacted Hill AFB operations and all appropriate measures are being taken to safeguard Hill AFB personnel, assets, and infrastructure.”
By Howard Altman in The War Zone
U.S. officials are still trying to discover the origin of drones that appeared over four U.S. Air Force bases in the U.K., another story we first broke. They’ve been spotted over RAF Lakenheath, RAF Mildenhall, and RAF Feltwell, all within close proximity, and RAF Fairford, about 130 miles to the west. A few days earlier, Ramstein Air Base in Germany joined the growing list of places registering unknown drone overflights.
TWZ has been on the leading edge of covering this topic for years and has broken multiple stories now about drone incursions over key U.S. bases and training ranges, as well as uncrewed aerial systems harassing American forces off the coasts of the United States and making worrisome overflights of important non-military sites. A spate of drone incursions over Langley Air Force Base in Virginia in December 2023, which TWZ was the first to report on, has now become a particular focal point of concern about these instances. While authorities are downplaying the majority of public drone sightings, they acknowledge a real concern about those flying over military installations, which, as we have noted, forced the closure of airspace over one of those facilities.
Update: 4:37 PM Eastern – MCB Camp Pendleton responded to our additional questions about the drones spotted over the facility. “Each instance of observance occurs when an individual, via line of sight, observes a suspected unmanned aerial system (UAS) perceived to be in Camp Pendleton air space. Based on observation for all six instances, deploying countermeasures was not necessary, and air and ground operations were not impacted.” The Pentagon’s top spokesman on Tuesday acknowledged that some of the drones flying over U.S. military bases could have been up to no good and shed additional light on the counter-drone capabilities being sent to two facilities in New Jersey.
“Is it possible that some of those drones could be up to malign activity? That’s entirely possible, but in the vast majority that is not the case,” Air Force Maj. Gen. Pat Ryder told reporters, including from The War Zone. “When we detect them, [we] attempt to classify them and take appropriate measures,” he added. “Is it possible that some of those are surveilling? Absolutely. But can you make that assumption in every case? Not necessarily so in each case.”
Installation commanders “have the authority to respond appropriately, and we’ll continue to do that.” As mentioned earlier in this story, both Picatinny Arsenal and Naval Weapons Station Earle will be receiving equipment to help them track and, if necessary, defeat drones. During his presser, Ryder offered new details of those capabilities.
“In addition to some of the capabilities that are already on these installations, these capabilities essentially will enhance a base authority’s ability to detect, identify and track UASs,” Ryder explained. “So for example, this could include active or passive detection capabilities, plus capabilities like the system known as Dronebuster, which employs non-kinetic means to interrupt drone signals and affect their ability to operate.” In a follow-up exchange with The War Zone, Ryder confirmed that Picatinny will be receiving the Dronebuster equipment. It’s a man-portable, radio-frequency jamming system with a pistol grip made by a company called Flex Force.
The Pentagon “doesn’t see a connection” between the drone sightings over military bases in the U.S. and overseas, Ryder said. When we pushed him to describe the visual and sensor similarities between these sightings, he declined to answer. Ryder reiterated a point made yesterday that the drones being reported are not connected to the military. They are not associated with the National Aerospace Research and Technology Park (NARTP) in southern New Jersey that develops and tests drones, he added.
U.S. officials are still trying to discover the origin of drones that appeared over four U.S. Air Force bases in the U.K., another story we first broke. They’ve been spotted over RAF Lakenheath, RAF Mildenhall, and RAF Feltwell, all within close proximity, and RAF Fairford, about 130 miles to the west. A few days earlier, Ramstein Air Base in Germany joined the growing list of places registering unknown drone overflights.
TWZ has been on the leading edge of covering this topic for years and has broken multiple stories now about drone incursions over key U.S. bases and training ranges, as well as uncrewed aerial systems harassing American forces off the coasts of the United States and making worrisome overflights of important non-military sites. A spate of drone incursions over Langley Air Force Base in Virginia in December 2023, which TWZ was the first to report on, has now become a particular focal point of concern about these instances. While authorities are downplaying the majority of public drone sightings, they acknowledge a real concern about those flying over military installations, which, as we have noted, forced the closure of airspace over one of those facilities.
NBA's problem is Economics and not Basketball
By Tyler Cowen in Bloomberg
The NBA seems to be having some trouble. This season’s TV ratings are either down precipitously or struggling to hold even, a shift too dramatic to be explained by cord-cutting alone, and meanwhile the NFL is doing fine. Tickets for the NBA Cup, the finals of which were Tuesday night (congratulations, Giannis!), went for half of what they did last year.
Part of the explanation may be that both teams are from small markets — Milwaukee and Oklahoma City — and part may be that fans have yet to figure out why they should care about this midseason tournament. The most plausible explanation, however, is economic.
The NBA has a salary cap, which prevents teams in major markets, such as the Los Angeles Lakers or New York Knicks, from snapping up all the talent. An unfortunate side effect is that it is harder for all teams to bid for additional players, or to keep the ones they have. Even when the total amount of the cap goes up, adding more talent at the margin has become increasingly costly in terms of penalties. It is becoming more difficult to form and maintain durable great teams, which makes it harder to elevate new superstars, which is what many fans want.
Think about a casual fan’s impressions of the NBA. They have heard of Michael Jordan and LeBron James, and maybe watched them play or even seen their movies. They know they are two of the all-time greats. The 27th-best player over that same time span — whoever it may be — is extremely accomplished, but does not attract anything close to the same attention. Superstars are what the game and its popularity are about, most of all with the marginal fans who do not know every player.
It is not surprising that one of the best-known players today, with more than 8 million Instagram followers, is Bronny James, son of LeBron. Bronny has barely played in the NBA and is far from a star; his popularity stems from his family story.
Jordan won six rings and LeBron four, but who is to follow in their footsteps and be the game’s marquee player? One candidate was Nikola Jokic, center for the Denver Nuggets and three-time MVP. Given his extraordinary statistics, he is in the running to be MVP again this year.
His team is another story. The Nuggets won an NBA title in 2023, but since then they have been in free fall. They let some of their key rotation players leave, most of all because of the salary cap. If they had kept those players around, or brought in star replacements, the Nuggets would have had to pay large fines to the league. Denver is a relatively small basketball market, so it made more sense to let the players walk. Jokic thus might retire with only one ring, when he could have three or four and become a truly iconic star.
Giannis Antetokounmpo was another contender to be the face of the game, and in addition to Tuesday night’s NBA Cup, his Milwaukee Bucks won a championship in 2021. Since then, they have not come close to winning again, or to patching up their limitations.
In fact, over the last six years, six teams have won the league title. Compare that to the 1990s, when Jordan’s Chicago Bulls won six titles, or to the 1980s, when the Celtics and Lakers between them won eight titles, with numerous iconic matchups against each other along the way. Tim Duncan’s San Antonio Spurs, Kobe Bryant’s (and Shaquille O’Neal’s) Lakers and Steph Curry’s Golden State Warriors all produced dominant teams with top stars.
Of course, there are also problems with the product itself. Regular-season games don’t mean very much, and the median outing is too often mediocre. Optimizing players no longer give their best in these settings. Due to basketball analytics, too many 3-point shots are taken. What was originally a source of excitement has become routinized and predictable. And perhaps American fans don’t relate as well to the growing number of foreign players and stars.
Yet there have always been issues with the product. In the 1990s, for example, there were far too many ugly fouls, and some say expansion diluted competition. The league’s main problem now is systemic. The NBA needs to remember that competitive parity is unsatisfying, and that we live in a celebrity culture. It’s all about who is going to be the next No. 1 — and the league needs to let market forces have a greater say.
Wednesday, December 18, 2024
Saturday, December 14, 2024
Bringing Elon to a knife fight
By Jennifer Pahlka in Eatingpolicy substack
DOGE has made it both impossible not to talk about government reform, and impossible to talk about it. The topic is everywhere, but the subject is now entirely eclipsed on the left by the horror of who has been assigned the task and the need to decry DOGE as a bad faith effort. Elon wants to launch his rockets without government interference, Vivek wants to gut the civil service, both want to cater to cronies. I am being told on the socials that anyone engaging in discussion of how to shape this effort or what good it could potentially bring is enabling the ambitions of an autocracy. The problem itself, barely legible to Dems before DOGE, is off the table again.
But we do need to talk about government reform, and while I’m sorry the conditions are quite a bit less than ideal, I think it's time we admitted they were always going to be. Democrats did not do this work. Many wonderful public servants made valiant efforts and scored some great wins, but Democratic leadership did not make it a top priority to clear out the underbrush that jams the gears of government. Elon’s ambitions should not serve as cover for Dems to continue to abdicate responsibility here. Until we know more about what DOGE is planning, I support Dems like Ro Khanna in pledging to work with them.
I am guessing that those most worried that DOGE will succeed have never tried their hand at reforming government. It’s hard. But easier, you say, with no respect for the law, and the DOGE team will be unencumbered by such details. But that’s not true. The lawsuits will come. A lot of the government tech community is skipping the hand wringing; they've basically just grabbed a bag of popcorn and are watching in real time as Elon and Vivek learn all the things they’ve known, lived, and absolutely hated for their entire time in public service. They don’t see DOGE as their savior, but they are feeling vindicated after years of shouting into the void. I am struck by how different the tone of the DOGE conversation is between political leaders on the left and the people who’ve been fighting in the implementation trenches. One group is terrified they’ll succeed. The other is starting to ask a surprising question (or at least I am): What if even billionaires can’t disrupt the system we have built?
Take the issue of respect for the law. Put aside the headline grabbing issues for a second and live in the mundane world of implementation in government. If you’ve spent the past ten years trying to make, say, better online services for veterans, or clearer ways to understand your Medicare benefits, or even better ways to support warfighters, you’ve sat in countless -– and I mean countless — meetings where you’ve been told that something you were trying to do was illegal. Was it? Now, instead of launching your new web form or doing the user research your team needed to do, you spend weeks researching why you are now branded as dangerously lawless, only to find that either a) it was absolutely not illegal but 25 years ago someone wrote a memo that has since been interpreted as advising against this thing, b) no one had heard of the thing you were trying to do (the cloud, user research, A/B testing) and didn’t understand what you were talking about so had simply asserted it was illegal out of fear, c) there was an actual provision in law somewhere that did seem to address this and interpreting it required understanding both the actual intent of the law and the operational mechanics of the thing you were trying to do, which actually matched up pretty well or d) (and this one is uncommon) that the basic, common sense thing you were trying to do was actually illegal, which was clearly the result of a misunderstanding by policymakers or the people who draft legislation and policy on their behalf, and if they understood how their words had been operationalized, they’d be horrified. It is absolutely possible to both respect the rule of law, considering the democratic process and the peaceful transfer of power sacred, and have developed an aversion to the fetishization of law that perverts its intent. The majority of public servants I know have well earned this right.
DOGE is about to crash into this wall of weaponization of the complexities of law, policy, regulation, process, and lore in defense of the status quo, and yes, the people in my community are watching. While our eyes are on potential abuses, they are also on the durability of the wall generally, and with deeply mixed emotions. It must be said: the wall is a problem. It is a problem for people who value the rule of law. It is a problem for people who care about an effective, responsive government.
A few kind and well-meaning folks have suggested that I should be in charge of DOGE. The way they see it, if only someone with my values and knowledge of the situation were given the same power that Elon and Vivek have been given this would all be good. Yes, I have spent fifteen years studying roughly the same problem DOGE is now attacking (framed differently, but even I have framed it differently over the years.) But a wish for someone like me to be in charge misunderstands the nature of the problem. Diagnosis we have. The power to change we do not. Billionaires are in charge now because they have power. Elon in particular has what Ezra Klein correctly ascribed to Trump, which is lack of inhibition. Normal people like me get scared and ashamed when we’re told we’re doing something illegal. Elon does not. I wish it were different, but perhaps the job of breaking the wall has ended up with someone who is suited to doing it. The norms that constrain even people with far more power than I have make it very difficult to break through that wall.
It’s not that Dems haven’t tried the billionaire move. Obama’s Secretary of Defense Ash Carter started the Defense Innovation Board and appointed Eric Schmidt to lead it, for example. Carter’s hope was to transform the military into a modern, effective (and dare I say efficient) institution, and saw Schmidt as extra firepower for change, so to speak. Many grumped about the unaccountable power Schmidt would have, but I served on that board for four years and that concern was laughable. The DIB did a lot of great work that Eric should be proud of (I am), but it was merely advisory, set up according to Federal Advisory Committee Act rules. It didn’t help that the board quickly attracted at least one gadfly ethics-watcher, whose objections to Eric’s actions were so nitpicky and procedural she represented the worst of the Pentagon, the worst of public service, the worst of accountability. In the end, she was an annoying speed bump, but it didn’t matter. We wrote papers that said things like “the Pentagon already knows everything in this paper, but seems unable to act on it,” and watched as the Pentagon appeared to technically act on recommendations without fundamentally changing much. Other efforts like DIUx made real progress, but the dent in the machine is still tiny. I admire what Eric did on DIB. But the presence of a billionaire is certainly no guarantee of change. Mike Bloomberg has the job now, and is still proving the point. Today, we spend even more to get even less deterrence.
It's really hard to have an accurate model for why change is so hard in large bureaucratic institutions, and specifically for public sector ones, where the differences in governance really do matter. On the one hand, I do still believe that the first order problem is simply lack of attention by people with power. If politics and policy take their fair share of your oxygen, there's really just very little left for the implementation. What is available gets used on getting that particular thing done, which usually means a hack around the system instead of permanently changing it so it can be easy next time. At its worst, the hacks to get it done this time actually make it harder on an ongoing basis.
But it's not as if mere attention would solve the problem. There are entrenched interests for the status quo. It's easy to imagine these as exclusively or even mostly commercial interests, but if that were true, why would it take three years to issue guidance as anodyne as the hiring memo the Biden Administration put out this summer? Among other provisions, it declared job titles could now be listed accurately (you could say it was a software developer instead of an IT specialist) and that you could now share “certs” of candidates who qualified for positions across agencies — but only under certain circumstances, and it’s still not widely done, so we mostly lose good applicants because they apply for one position and aren’t considered for others. These and a few other changes took a smart, dedicated, caring team three years to get done. I celebrated with the people responsible because it was a huge accomplishment for them. I am grateful to them and proud of them. I am not proud of the system in which that was a huge accomplishment.
Is someone against these minor and helpful changes? You could say that the vendors profit from government’s slow and poor process for hiring, but I struggle to imagine this was high on any lobbyists’ target lists. It's far more likely that there is just a perplexing combination of legitimate and imagined reasons for caution, and review by a staggering array of stakeholders. As I talked about in my book, outsiders (and certainly the right) imagine dangerously concentrated power in the executive branch, and seek to limit it. The reality is shockingly diffuse power. The bad outcomes they are fighting to prevent — burdensome, overreaching government — are the product of exactly the conditions they help create. Neither the left nor the right really has the mental models (nor, perhaps the desire) to effectively challenge the status quo of the technocracy.
But I don't want to dismiss the difficulty of confronting commercial interests. I think it’s fair to say that in the time I’ve been working on this issue we haven’t really seen the vendor ecosystem threatened in any meaningful way. The supplier base for government tech, for example, is not all that different from when I started Code for America in 2010. Anduril has made strides at the DoD for sure, and some startups have done some interesting stuff. Companies like Nava, a public benefit corporation whose costs and outcomes in contracting with agencies like CMS and VA should make it a strong choice across civilian government, is now over 500 people. But CGI Federal, just to pick one of the incumbents, is 90,000. Lockheed Martin is 122,000. Anduril is 3,500. It’s not like we’ve seen some massive shakeup.
You can tell a credible story about the resistance to change that doesn’t require any dirty tricks on the part of the incumbents, and I’ve often left it at that, in part because the vendors don’t come at me. I don’t threaten them. But they do go after those who do threaten them. Oracle recently admitted to some very dirty tricks in trying to keep their competitors from winning the JEDI cloud contract at the DoD. (More to say about this in a future post, perhaps. Or perhaps I’ll be too scared to attract their ire. It happens.) It’s not that they succeeded in getting the contract — they’d already lost it. They were just following their scorched-earth policy of ensuring that if they couldn’t have it, no one could. And by no one, they meant their competitors, but the result is that the DoD still doesn’t have the access to the cloud that JEDI architect Chris Lynch envisioned. Our national defense is that much slower and less secure because Oracle can’t lose. These are some of the conditions under which change in government is supposed to take place.
We can wish that the government efficiency agenda were in the hands of someone else, but let’s not pretend that change was going to come from Democrats if they’d only had another term, and let’s not delude ourselves that change was ever going to happen politely, neatly, carefully. However we got here, we may now be in a Godzilla vs. Kong world. Perhaps we’re about to get a natural experiment in which Elonzilla faces off with Larry ElliKong. One of the things we need to be ready to learn is that Elonzilla could lose. Or worse, since Elon and Larry are friends, the expected disruptor could get co-opted. And what would that say about the problem? Conjuring Elon is not bringing a gun to a knife fight. It was never a knife fight.
But I don't want to dismiss the difficulty of confronting commercial interests. I think it’s fair to say that in the time I’ve been working on this issue we haven’t really seen the vendor ecosystem threatened in any meaningful way. The supplier base for government tech, for example, is not all that different from when I started Code for America in 2010. Anduril has made strides at the DoD for sure, and some startups have done some interesting stuff. Companies like Nava, a public benefit corporation whose costs and outcomes in contracting with agencies like CMS and VA should make it a strong choice across civilian government, is now over 500 people. But CGI Federal, just to pick one of the incumbents, is 90,000. Lockheed Martin is 122,000. Anduril is 3,500. It’s not like we’ve seen some massive shakeup.
You can tell a credible story about the resistance to change that doesn’t require any dirty tricks on the part of the incumbents, and I’ve often left it at that, in part because the vendors don’t come at me. I don’t threaten them. But they do go after those who do threaten them. Oracle recently admitted to some very dirty tricks in trying to keep its competitors from winning the JEDI cloud contract at the DoD. (More to say about this in a future post, perhaps. Or perhaps I’ll be too scared to attract their ire. It happens.) It’s not that they succeeded in getting the contract — they’d already lost it. They were just following their scorched-earth policy of ensuring that if they couldn’t have it, no one could. And by no one, they meant their competitors, but the result is that the DoD still doesn’t have the access to the cloud that JEDI architect Chris Lynch envisioned. Our national defense is that much slower and less secure because Oracle can’t lose. These are some of the conditions under which change in government is supposed to take place.
We can wish that the government efficiency agenda were in the hands of someone else, but let’s not pretend that change was going to come from Democrats if they’d only had another term, and let’s not delude ourselves that change was ever going to happen politely, neatly, carefully. However we got here, we may now be in a Godzilla vs Kong world. Perhaps we’re about to get a natural experiment in which Elonzilla faces off with Larry ElliKong. One of the things we need to be ready to learn is that Elonzilla could lose. Or worse, since Elon and Larry are friends, the expected disruptor could get co-opted. And what would that say about the problem? Conjuring Elon is not bringing a gun to a knife fight. It was never a knife fight.
Thursday, December 05, 2024
The many twists and turns in voter behaviour
By TCA Srinivasa Raghavan in Hindu BusinessLine
Political attitudes, preferences and voting behaviour have been a topic of intense scrutiny for half a century. At the end of all that study, no one really knows how voters will vote on voting day. That’s why pollsters go wrong.
Recent Indian experience with welfare programmes should have led to Indian research on whether economic factors matter more than non-economic ones. But our academics are obsessed with varna and jati, caste and sub-caste.
I recall reading a British paper from Oxford on the subject of voters and economics about 10 years ago. The question was whether voters look at the economy as a whole or only their own pockets, and whether past misery is less important than a promise of achche din.

GUARANTEES AND VOTING

An important question in this regard is whether they change their minds about how to vote after political parties bribe them with income guarantees. Or perhaps other benefits that come free? An interesting experiment in the US has been described in a working paper posted recently on the National Bureau of Economic Research (NBER) website.
It’s by David E. Broockman, Elizabeth Rhodes, Alexander W. Bartik, Karina Dotson, Sarah Miller, Patrick K. Krause and Eva Vivalt (Working Paper 33214). They have studied what affects political attitudes and behaviour. They say that “a large positive income shock delivered through a private guaranteed income program had limited effects on most political outcomes, with a few notable exceptions.” What happens apparently is that political predispositions don’t get affected by such increases in income. People go on believing what they have always believed. Thus they say: “Despite receiving a substantial income increase ($12,000 annually) over three years, participants showed minimal changes in a wide range of political views and behaviors, including political participation, party identification, policy preferences, trust in institutions, and support for democracy.”
They then say that previous research has also shown that “government-sponsored cash transfers often increase support for incumbents and voter turnout.” They go on to say that much depends on whether it is the government or someone else who makes the transfers.
Most importantly, it appears people view work as being preferable to handouts. They say more research is needed “into how experiencing guaranteed income programs shapes attitudes about work and deservingness.”
“The relative stability of most political attitudes in the face of a substantial income shock suggests that economic circumstances may primarily affect political behavior through mechanisms other than direct effects of income…”
Overall, all the analysis hasn’t led to anything definitive about voter behaviour. It’s as unpredictable as the behaviour of a bird sitting on a branch. (After Mao Zedong banned gambling, the Chinese, who love it, used to bet on when the crow would fly off.)
CETERIS PARIBUS METHOD
I think the researchers might have framed the question wrongly. Perhaps they should be using the ceteris paribus method: all other things being constant, how would a positive change in “economic incentives” alter voter preferences?
This would certainly be a more fruitful approach in India where sociological attitudes and preferences are virtually unshakable. Economic incentives in such a situation not only have to be very substantial but also increasing at a faster rate with each election. We have seen this happening over the last two decades.
Not that there haven’t been exceptions as when BSP voters in UP aligned with Brahmins. But that’s very rare. 99 per cent of the time voters stick to their sociologies.
So the key issue for Indian political parties is to ascertain how much is enough if voter behaviour is nearly unshakable.
Competition at the margin will leave everyone worse off, as recent Assembly elections have shown.
Friday, November 29, 2024
Wednesday, November 27, 2024
Sunday, November 03, 2024
Germany - Sick leave
Laura Pitel in Berlin NOVEMBER 1 2024
German business executives have warned that high levels of sick leave are damaging the competitiveness of Europe’s largest economy and compounding its economic woes.
Workers missed an average of 19.4 days because of illness in 2023, according to Techniker Krankenkasse, the country’s largest public health insurance provider.
Preliminary figures suggest the trend is on course to continue its upward trajectory, TK told the Financial Times, exacerbating challenges for an economy that many expect to contract for the second year running in 2024.
While it is notoriously difficult to compare data from country to country, Christopher Prinz, an expert on employment at the OECD, said Germany was “definitely among the higher countries” when it came to sick leave.
The issue has fed into a debate about the future of the country’s economic model, with high energy prices, labour shortages and stifling bureaucracy hitting manufacturers that have for decades driven growth.
An executive at a blue-chip manufacturer lamented “a complete unwillingness”, especially among some “work-shy” younger workers, to understand the sacrifices needed to maintain prosperity and competitiveness.
“And then everyone wonders why Germany is the sick man of Europe,” he said.
Paul Niederstein, co-owner and chief executive of steel galvanising business Coatinc, which has about 600 employees in Germany and 900 elsewhere, said the high absence rate was a symptom of a labour force that had become “too spoilt and too self-confident”.
A study published in January by the German Association of Research-Based Pharmaceutical Companies (VFA), an industry body, found that were it not for the country’s above-average number of sick days, the German economy would have grown 0.5 per cent last year, rather than shrinking 0.3 per cent.
Claus Michelsen, the study’s author, said high sick levels were exacerbating a shortage of skilled workers.
Heads at Elon Musk’s electric-car maker Tesla in September sought to tackle high sick rates by conducting unannounced home visits to check up on absent employees at its factory near Berlin.
While few German executives support such a controversial approach, there is deep unease in many companies about the trend.
Mercedes-Benz chief executive Ola Källenius recently claimed that sickness absence at its German production sites was sometimes twice as high as in other countries, despite the same conditions.
“As employers, we do a lot to support people: from occupational safety and ergonomic work processes to health advice, flu vaccinations and resilience training,” he told Der Spiegel. “But it takes all sides to achieve an improvement here.”
The TK data show the biggest change, besides a post-coronavirus bump in respiratory illnesses, has come from a steep rise in mental health cases since the turn of the millennium.
There has been growing criticism of pandemic-era rules enabling patients to receive sick notes from a doctor by telephone without a face-to-face examination.
Finance minister Christian Lindner said in September that there was “a correlation between the annual sick leave in Germany and the introduction of the measure” and called for it to be abolished. The country’s association of general practitioners this week pushed back, saying the measure was a rare success story in efforts to reduce bureaucracy in the healthcare system.
But Gerd Röders, who runs a 200-year-old family business supplying parts to the automotive, aviation and pharmaceutical sectors, said it was too easy for workers to be written off sick by a doctor. He suggested that the first three days of absence through sickness could be unpaid. “I don’t want to sound like an asshole, but maybe it would make people think twice,” he said.
Even before the pandemic, Germany’s sick leave rates were among the highest in the developed world.
OECD data on compensated absence from work due to illness — compiled from sources including health ministries and health insurers — shows Germany’s rate as the highest in the group of advanced countries, with 22.4 days a year in 2022, the latest available data.
The OECD’s labour force survey, which Prinz said enabled better comparisons because it was self-reported by workers, places Germany seventh — behind countries such as Norway, Finland, Spain and France — with 6.8 per cent of workers’ usual weekly hours lost due to sickness absence.
The OECD survey did not provide figures for the UK, where the statutory sick pay regime is one of the least generous in the developed world. Sickness absence has also increased in Britain since the pandemic, but to a much lower level. The latest UK data, for 2022, shows an absence rate of 2.6 per cent, up from 2 per cent in 2019.
In Germany, all employees are entitled by law to six weeks’ sick leave a year at full pay. If an employee comes down with an illness during a holiday, and secures a doctor’s note to prove it, they can claim back those days of leave and use them another time.
Prinz said it was possible that Germany’s policies were well designed. “We want people who are sick to be on sick leave. There’s a strong argument for sickness schemes actually helping productivity, health and labour market participation.”
Hans-Jürgen Urban, board member of the country’s largest industrial trade union IG Metall, said high sick leave levels in a company should be seen as an “alarm signal” that indicates a raft of underlying pressures on workers. “Anyone who complains about high levels of absenteeism must look for the root causes: in the workplace itself.”
German workers also took a big hit to their pay in real terms following the pandemic, the global energy shock caused by Russia’s invasion of Ukraine and the high inflation that followed.
Despite the weak German economy, wages have been catching up rapidly this year, but remain below their pre-pandemic levels once adjusted for inflation.
Andreas Tautz, chief medical officer at DHL Group, which has about 600,000 employees worldwide and 220,000 in Germany, stressed that Germany was “still one of the most productive countries [in the world]”.
However, in terms of productivity growth, the outlook was less rosy, with a contraction last year.
Coatinc’s Niederstein said it was important for companies to be self-critical, warning that high sickness rates could reflect poor culture and leadership.
But he added that workers were also “not willing” to appreciate the pressure businesses were under and “understand what happens in Mexico or Turkey or other countries”.
“Germany needs to be much less arrogant and needs to reflect the international business environment that we compete with,” he said.
Additional reporting by Valentina Romei and Delphine Strauss. Data visualisation by Janina Conboye
Thursday, October 31, 2024
Wednesday, October 30, 2024
Saturday, October 26, 2024
The Past, Present, and Future of Office Work
Welcome back! Last week we talked about the economic importance of communication and the rise and fall of the telephone operator occupation, which mostly employed young women. Between 1910 and 1940, AT&T and its satellite companies rapidly adopted mechanical switching, which quickly put telephone operators out of work. Yet future cohorts weren’t harmed at all. They simply found other jobs, especially as typists and secretaries.
Why did typist and secretary jobs grow so rapidly in the early 1900s? Interestingly, the word “secretary” doesn’t even appear in the U.S. Census occupation descriptions until 1940. Occupation code number 236 in the 1940 Census is described as “stenographers, typists and secretaries.” From 1910 to 1940 it was just “stenographers and typists”, and before 1910 all office work was categorized simply as “clerks and copyists”.
New job titles often augur substantive changes in work, and this is no exception. The secretary occupation evolved over the first half of the 20th century alongside rapid technological advances that aided the transcription and communication of language from spoken to written form.
The economic importance of note-taking
The original technology for transcribing spoken words was shorthand. The practice of writing shorthand dates back at least to ancient Greece (the word stenography comes from the Greek stenos “narrow” and graphein “to write”), but scholars have found many examples of shorthand from Rome, Imperial China, Japan, and other ancient cultures.
Shorthand exists because people have highly imperfect memories, and because we can speak faster than we can write. Recording speech is especially important when words must be memorialized for legal reasons, such as in court proceedings. James Madison took notes on the Constitutional Convention in his own shorthand, making him the sole recorder of the founding principles of the newly formed United States of America.
Like language, shorthand existed in many forms, but some eventually became more dominant than others. Probably the best-known system is Gregg shorthand, which is still used today for court reporting. Gregg is about three times faster than regular writing, primarily because it records sounds rather than spelling (omitting silent letters) and uses abbreviations for common words (e.g. “k” for “can”). Here is an example of a sentence written using Gregg shorthand – you can see how it saves time to write this way!
Gregg shorthand allowed stenographers to reach speeds upward of 100 words per minute, compared to about 30 for regular handwriting. That sounds impressive, but the average person speaks at about 150 words per minute, faster than most shorthand writers. As with most tasks, however, a machine can help you do it faster. Miles Bartholomew invented the shorthand machine in 1879, pictured below.
As you can see above, the stenotype keyboard is much smaller than a regular keyboard. Words are written by punching the keys together in combinations (for example, pressing the K, the A, and the T together is how you write the word “cat”).
With modern machines, stenographers can reach typing speeds of up to 300 words per minute, about ten times the speed of normal handwriting and twice as fast as the spoken word.
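The chording idea described above – one simultaneous key combination producing a whole word – can be sketched as a simple lookup keyed on sets of pressed keys. This is a toy illustration only; the chords and abbreviations here are invented for the example, and real stenotype theories are far richer.

```python
# Toy model of stenotype chording: each "stroke" is a set of keys pressed
# at once, and a dictionary maps strokes to words. Real steno systems map
# chords to syllables/sounds with elaborate rules; this only shows the idea.
CHORD_DICTIONARY = {
    frozenset("KAT"): "cat",  # K + A + T pressed together, as in the example
    frozenset("K"): "can",    # a single-key abbreviation for a common word
}

def transcribe(strokes):
    """Translate a sequence of chords into words; unknown chords fall back
    to their raw letters in sorted order."""
    return [CHORD_DICTIONARY.get(frozenset(s), "".join(sorted(s)))
            for s in strokes]

print(transcribe([{"K", "A", "T"}, {"K"}]))  # ['cat', 'can']
```

Because an entire word arrives in one stroke instead of one keypress per letter, it is easy to see how a practiced operator can far outpace sequential typing.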
If shorthand is so great, why isn’t everybody using it? The barriers to entry are too high. Gregg shorthand takes months to master, and as you can see from the drawing above, it is utterly incomprehensible to outsiders. Shorthand greatly lowers the cost of transcription, but not the cost of communication, because non-stenographers can’t understand it.
The typewriter revolutionized written communication

The invention that truly unlocked mass communication was the typewriter. The typewriter was independently “invented” many times during the early 19th century, but early prototypes were mostly curiosities that didn’t pass a benefit-cost test relative to shorthand or regular handwriting.1 The first commercially successful typewriter was the Remington No. 2, which sold about 100,000 units between 1874 and 1891.
Even though the Remington No. 2 was very expensive (it cost about $100, roughly $4,000 in 2024 dollars), it was clearly a communications breakthrough, for two reasons. First, Remington showed in a highly publicized contest (see the poster below) that people could reach speeds of 100 words per minute by “touch typing” (i.e. using most of their fingers and not looking at the keyboard, the way you are taught in typing class). This offered the possibility of near-perfect translation from spoken to written word, without the need for shorthand. Second, with carbon paper and typewriter stencils, typewriters could make multiple copies of the same document.2
The secretary occupation evolved alongside the mass adoption of the typewriter. Typewriter production exploded in the early 1900s, with major brands like Underwood selling an estimated 5 million units between 1900 and 1930. Growth in typewriter sales was driven by gradual improvements in quality and cost, including front striking (so you could see what you were typing) and eventually the electric typewriter.3 Large companies increasingly employed secretarial “pools”, groups of secretaries who were deployed to executives as needed for typing and other office duties. Gibbs College, the first secretarial school, opened in 1911.
Over a period where the typewriter was rapidly adopted, the occupation “typists and secretaries” grew sixfold, from less than 0.5 percent of all employment in 1900 to 3 percent in 1950.
How the typewriter fueled the rise of office work
Technological innovation in communication increased total demand for communication jobs. New technology didn’t destroy jobs because typewriters can’t write by themselves – they need a human operator. For the first time in history, we could inexpensively create accurate written accounts of meetings, phone calls, and other events. Businesses could also keep records of important transactions such as sales and expenses, and they could store and maintain customers and client information.
Over this same period, we also saw growing demand for other office functions related to information storage and retrieval. The figure below plots the trend over time in office and administrative support occupations. The solid blue line shows employment of financial clerks - people who keep records of finances, payroll and accounts and file and process paperwork from customers and clients. The red dashed line shows typists, secretaries, and administrative assistants and is identical to the chart from last week’s post. Finally, the dotted green line shows employment in other back-office jobs like proofreaders, office machine operators, data entry keyers, and general office clerks.
At its peak in 1980, office and administrative support work accounted for 12.7% of all workers in the U.S. economy. That’s one in eight jobs devoted entirely to the production, processing, storage, delivery, and retrieval of written information.
Yet since 1980, employment in all three occupation categories has declined rapidly, falling from 12.7% to only 6.8% in 2022. Today, secretaries and administrative assistants are as common as a share of all jobs as they were in 1920.
Digitization and the nonrivalry of data
What explains the decline of office work since 1980? If you’ve been reading regularly, you already know the answer. It was the personal computer. Like the typewriter, the personal computer facilitated the recording and storage of information in written form. However, unlike the typewriter, computers can record, store, and manipulate information in digital rather than physical form.
Digital information storage has several advantages. First, you can more easily make changes to a document without having to reproduce the whole thing. Second, filing and organizing is much easier because documents can be sorted on multiple characteristics (you can search the files on your hard drive by keyword, date, or folder - but physical documents can only be in one place). Third, the absence of a physical form means that digital information can more easily be copied, delivered, and preserved. In the language of economics, digital information is nonrival, meaning usage by one person does not crowd out usage by another person. A physical document can only be in one place at a time.
Once information became easy to store and manipulate, we no longer needed so many people to transcribe, codify, and organize it. Secretaries focused increasingly on other duties, like scheduling and coordinating meetings and personal assistance. Yet office calendars are increasingly digitized and synced up within organizations, and only high-level executives have their own assistant. The job of secretary/administrative assistant is likely in permanent decline.
The nonrivalry of digital information (e.g. data) has had broader impacts on the overall organization of the economy. Digitization has lowered to zero the cost of reproducing information, and it has greatly lowered the cost of manipulating and editing documents and other forms of digital data. Basic economic reasoning tells us that falling costs leads to an increase in supply. At the dawn of the 20th century, the U.S. economy was data-scarce – just codifying information alone had economic value. A hundred years later, we are data-drenched, to the point of drowning.
By some estimates, the internet in 2024 collectively stores about 147 zettabytes (ZB) of data.4 One ZB is equivalent to the storage capacity of 250 billion DVDs! Ex-Google CEO Eric Schmidt estimated that only about 0.05 ZBs of information was created over humanity’s entire history up to 2003.
When information is abundant, the ability to make sense of it becomes especially valuable. This explains why managerial, professional, and technical occupations have grown since 1980, as rapidly as clerical work has declined. These jobs require you to go beyond collecting and storing information.
Jobs with titles like “business analyst”, “consultant”, and “solutions architect” require workers analyze and synthesize information in ways that improve business decision-making. In some ways, it is data compression, not collection – distilling a sea of information down to its most critical elements. When you have access to more than 250 billion DVDs worth of data, it’s important to know what you are looking for!
Artificial intelligence - information processing made easy
We can think of the large language models (LLMs) underpinning generative AI tools as performing incredibly sophisticated operations on data (primarily words). Each time you ask ChatGPT a question, it is effectively compressing all 147 zettabytes of the internet in a way that delivers a highly customized response to your specific query.
Generative AI commodifies the manipulation of digital information. It may take half a century, but I believe it will eventually lead to the extinction of office and administrative support jobs like administrative assistants and financial clerks. The entire purpose of these jobs is to lower the cost of transmitting and storing information. In the long-run, AI will drive the cost of “routine” information processing down to nearly zero, eventually eliminating the need for most human labor in those jobs (although we will need a lot more energy efficiency to get there!)
There is a clear analogy here to the impact of mechanization on farm labor. For most of human history, the bottleneck to increasing food production was physical power. Steam and electricity eventually relaxed that constraint, and farm work mostly disappeared because we only need so much food.
Similarly, the key bottleneck in business decision-making for most of modern history was a lack of information. Advances in information collection, storage and retrieval eventually relaxed that constraint, and now we are awash in data. Routine office jobs were created in a time of information scarcity, and they may no longer be needed.
A harder question is whether AI will eventually replace jobs that analyze and synthesize information to improve decision-making. I am less certain that will happen, for two reasons. First, there is an essential complementarity between the ways that LLMs and humans analyze information. Generative AI models excel at any task for which there are many existing examples in their training data. They can access information about anything that has ever happened anywhere repeatedly, a feat far too difficult for the human mind. People, on the other hand, are excellent guessers. We reason remarkably well with very little data. Because AI models and humans approach problems differently, I can imagine lots of situations where people who know when to use AI and when to overrule it will do better than either party acting alone.
Second, economic interactions are often strategic, meaning the right decision depends on what you think your competitors will do and how they will respond. Because LLMs reason from training data, they are never fully up to date. Concretely, imagine that two companies are developing strategies to outcompete their opponent for market share. They use a frontier AI model to tell them where they should locate new stores, but the right answer depends on what their opponent is doing, which in turn depends on what AI model the opponent is using. If you both have access to the same AI technology, the winner will be the company with a better human in charge.5
Still, the frontier of AI technology is advancing rapidly. I am making predictions based only on what I see today and some quasi-linear extrapolation to the near future. If the day comes that an AI agent can run a Fortune 500 company without human assistance, then I, for one, will welcome our new robot overlords.
1 The Early Office Museum contains some wonderfully quirky examples of antique typing machines, including the Kaligraph, Charles Thurber’s Patent Printer, and the Hansen Writing Ball.
2 Two other important innovations were 1) the shift key, which moved a different part of the typebar to contact with the ribbon, allowing for both upper- and lowercase letters to be used without changing the typebar manually; and 2) the QWERTY keyboard layout, which minimized typebar jams by spacing out frequent letter combinations.
3 Electric typewriters were much more reliable and allowed for other complementary improvements like proportional spacing and the typeball or “golfball” design, which reduced jams and allowed multiple fonts to be used in the same document.
4 Caveat – I have no idea how they arrived at this number! It seems like a hard thing to estimate.
5 The idea that what I do depends on what you will do, which depends on what I do, and so on is called “level-K reasoning”. People have already programmed AI agents with level-K reasoning capabilities, but I am not aware of any evidence on whether such agents can reliably outperform people in real-world strategic interactions.
Why did typist and secretary jobs grow so rapidly in the early 1900s? Interestingly, the word “secretary” doesn’t even appear in the U.S. Census occupation descriptions until 1940. Occupation code number 236 in the 1940 Census is described as “stenographers, typists and secretaries.” From 1910 to 1940 it was just “stenographers and typists”, and before 1910 all office work was categorized simply as “clerks and copyists”.
New job titles often augur substantive changes in work, and this is no exception. The secretary occupation evolved over the first half of the 20th century alongside rapid technological advances that aided the transcription and communication of language from spoken to written form.
The economic importance of note-taking
The original technology for transcribing spoken words was shorthand. The practice of writing shorthand dates back at least to ancient Greece (the word stenography comes from the Greek stenos “narrow” and graphein “to write”), but scholars have found many examples of shorthand from Rome, Imperial China, Japan, and other ancient cultures.
Shorthand exists because people have highly imperfect memories, and because we can speak faster than we can write. Recording speech is especially important when words must be memorialized for legal reasons, such as in court proceedings. James Madison took notes on the Constitutional Convention in his own shorthand, making him the sole recorder of the founding principles of the newly formed United States of America.
Like language, shorthand existed in many forms, but some eventually became more dominant than others. Probably the best-known system is Gregg shorthand, which is still used today for court reporting. Gregg is about three times faster than regular writing, primarily because it records sounds rather than spelling (omitting silent letters) and uses abbreviations for common words (e.g. “k” for “can”). Here is an example of a sentence written using Gregg shorthand – you can see how it saves time to write this way!
Gregg shorthand allowed stenographers to reach speeds upward of 100 words per minute, compared to about 30 for regular handwriting. That sounds impressive, but the average person speaks at about 150 words per minute, faster than most shorthand writers. However, as with most tasks, a machine can help you do it faster. Miles Bartholomew invented the shorthand machine in 1879, pictured below.
As you can see above, the stenotype keyboard is much smaller than a regular keyboard. Words are written by punching the keys together in combinations (for example, pressing the K, the A, and the T together is how you write the word “cat”).
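The chording idea can be sketched as a lookup from a set of simultaneously pressed keys to a whole word. The chord table below is invented for illustration and is not real steno theory:

```python
# Toy sketch of stenotype-style chording: one "stroke" (a set of keys
# pressed at the same time) produces a whole word, rather than one letter.
# The chord table is made up for illustration, not a real steno system.

CHORDS = {
    frozenset("KAT"): "cat",
    frozenset("TH"): "the",
    frozenset("KAN"): "can",
}

def stroke(keys):
    """Translate one simultaneous key press into text."""
    # frozenset makes the lookup order-independent: it only matters
    # which keys are down, not the order you name them in.
    return CHORDS.get(frozenset(keys), "?")

print(stroke("KAT"))  # one stroke, one word
```

Because the lookup key is a set, pressing K, A, and T together yields “cat” no matter how the keys are listed, which is what lets a single stroke replace several sequential keystrokes.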
With modern machines, stenographers can reach typing speeds of up to 300 words per minute, about ten times the speed of normal handwriting and twice as fast as the spoken word.
If shorthand is so great, why isn’t everybody using it? The barriers to entry are too high. Gregg shorthand takes months to master, and as you can see from the drawing above, it is utterly incomprehensible to outsiders. Shorthand greatly lowers the cost of transcription, but not the cost of communication, because non-stenographers can’t understand it.
The typewriter revolutionized written communication
The invention that truly unlocked mass communication was the typewriter. The typewriter was independently “invented” many times during the early 19th century, but early prototypes were mostly curiosities that didn’t pass a benefit-cost test relative to shorthand or regular handwriting.1 The first commercially successful typewriter was the Remington No.2, which sold about 100,000 units between 1874 and 1891.
Even though the Remington No.2 was very expensive (it cost about $100, which is roughly $4,000 in 2024 dollars), it was clearly a communications breakthrough, for two reasons. First, Remington showed in a highly publicized contest (see the poster below) that people could reach speeds of 100 words per minute by “touch typing” (i.e., using most of their fingers and not looking at the keyboard, the way you are taught in typing class). This offered the possibility of near-perfect translation from spoken to written word, without the need for shorthand. Second, with carbon paper and typewriter stencils, typewriters could make multiple copies of the same document.2
The secretary occupation evolved alongside the mass adoption of the typewriter. Typewriter production exploded in the early 1900s, with major brands like Underwood selling an estimated 5 million units between 1900 and 1930. Growth in typewriter sales was driven by gradual improvements in quality and cost, including front striking (so you could see what you were typing) and eventually the electric typewriter.3 Large companies increasingly employed secretarial “pools”, groups of secretaries who were deployed to executives as needed for typing and other office duties. Gibbs College, the first secretarial school, opened in 1911.
Over a period where the typewriter was rapidly adopted, the occupation “typists and secretaries” grew sixfold, from less than 0.5 percent of all employment in 1900 to 3 percent in 1950.
How the typewriter fueled the rise of office work
Technological innovation in communication increased total demand for communication jobs. New technology didn’t destroy jobs because typewriters can’t write by themselves – they need a human operator. For the first time in history, we could inexpensively create accurate written accounts of meetings, phone calls, and other events. Businesses could also keep records of important transactions such as sales and expenses, and they could store and maintain customer and client information.
Over this same period, we also saw growing demand for other office functions related to information storage and retrieval. The figure below plots the trend over time in office and administrative support occupations. The solid blue line shows employment of financial clerks - people who keep records of finances, payroll and accounts and file and process paperwork from customers and clients. The red dashed line shows typists, secretaries, and administrative assistants and is identical to the chart from last week’s post. Finally, the dotted green line shows employment in other back-office jobs like proofreaders, office machine operators, data entry keyers, and general office clerks.
At its peak in 1980, office and administrative support work accounted for 12.7% of all workers in the U.S. economy. That’s one in eight jobs devoted entirely to the production, processing, storage, delivery, and retrieval of written information.
Yet since 1980, employment in all three occupation categories has declined rapidly, falling from 12.7% to only 6.8% in 2022. Today, secretaries and administrative assistants are as common as a share of all jobs as they were in 1920.
Digitization and the nonrivalry of data
What explains the decline of office work since 1980? If you’ve been reading regularly, you already know the answer. It was the personal computer. Like the typewriter, the personal computer facilitated the recording and storage of information in written form. However, unlike the typewriter, computers can record, store, and manipulate information in digital rather than physical form.
Digital information storage has several advantages. First, you can more easily make changes to a document without having to reproduce the whole thing. Second, filing and organizing is much easier because documents can be sorted on multiple characteristics (you can search the files on your hard drive by keyword, date, or folder - but physical documents can only be in one place). Third, the absence of a physical form means that digital information can more easily be copied, delivered, and preserved. In the language of economics, digital information is nonrival, meaning usage by one person does not crowd out usage by another person. A physical document can only be in one place at a time.
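The multi-characteristic search advantage is easy to see in code. A minimal sketch, with invented file names and contents, of filtering the same document collection by keyword, folder, or date at once:

```python
# A digital document can be found by keyword, date, AND folder at once;
# a paper document can only sit in one drawer. All data below is invented.
from datetime import date

docs = [
    {"name": "invoice_042.txt", "folder": "finance",
     "date": date(2024, 3, 1), "text": "payment due"},
    {"name": "minutes.txt", "folder": "meetings",
     "date": date(2024, 3, 1), "text": "quarterly payment plan"},
    {"name": "memo.txt", "folder": "finance",
     "date": date(2023, 11, 5), "text": "office supplies"},
]

def search(docs, keyword=None, folder=None, on=None):
    """Filter one collection along any combination of attributes."""
    return [d["name"] for d in docs
            if (keyword is None or keyword in d["text"])
            and (folder is None or d["folder"] == folder)
            and (on is None or d["date"] == on)]

print(search(docs, keyword="payment"))                    # by content
print(search(docs, keyword="payment", folder="finance"))  # content AND location
```

The same document shows up under every attribute it matches, which is exactly what a physical filing cabinet cannot do without making copies.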
Once information became easy to store and manipulate, we no longer needed so many people to transcribe, codify, and organize it. Secretaries focused increasingly on other duties, like scheduling and coordinating meetings and personal assistance. Yet office calendars are increasingly digitized and synced up within organizations, and only high-level executives have their own assistant. The job of secretary/administrative assistant is likely in permanent decline.
The nonrivalry of digital information (i.e., data) has had broader impacts on the overall organization of the economy. Digitization has lowered to zero the cost of reproducing information, and it has greatly lowered the cost of manipulating and editing documents and other forms of digital data. Basic economic reasoning tells us that falling costs lead to an increase in supply. At the dawn of the 20th century, the U.S. economy was data-scarce – just codifying information alone had economic value. A hundred years later, we are data-drenched, to the point of drowning.
By some estimates, the internet in 2024 collectively stores about 147 zettabytes (ZB) of data.4 One ZB is equivalent to the storage capacity of 250 billion DVDs! Ex-Google CEO Eric Schmidt estimated that only about 0.05 ZBs of information was created over humanity’s entire history up to 2003.
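The DVD comparison is easy to sanity-check with back-of-the-envelope arithmetic (the 4.7 GB single-layer disk size is my assumption; the article's rounder 250 billion figure corresponds to treating a DVD as 4 GB):

```python
# Back-of-the-envelope check of the "one ZB is about 250 billion DVDs" claim.
ZB = 1e21      # one zettabyte, in bytes
DVD = 4.7e9    # single-layer DVD capacity, in bytes (assumed)

dvds_per_zb = ZB / DVD
print(f"{dvds_per_zb / 1e9:.0f} billion DVDs per zettabyte")

internet_2024_zb = 147
print(f"{internet_2024_zb * dvds_per_zb:.2e} DVDs for the 2024 internet")
```

At 4.7 GB per disk the answer is roughly 213 billion DVDs per zettabyte, the same order of magnitude as the quoted figure.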
When information is abundant, the ability to make sense of it becomes especially valuable. This explains why managerial, professional, and technical occupations have grown since 1980, as rapidly as clerical work has declined. These jobs require you to go beyond collecting and storing information.
Jobs with titles like “business analyst”, “consultant”, and “solutions architect” require workers to analyze and synthesize information in ways that improve business decision-making. In some ways, it is data compression, not collection – distilling a sea of information down to its most critical elements. When you have access to more than 250 billion DVDs worth of data, it’s important to know what you are looking for!
Artificial intelligence - information processing made easy
We can think of the large language models (LLMs) underpinning generative AI tools as performing incredibly sophisticated operations on data (primarily words). Each time you ask ChatGPT a question, it is effectively compressing all 147 zettabytes of the internet in a way that delivers a highly customized response to your specific query.
Generative AI commodifies the manipulation of digital information. It may take half a century, but I believe it will eventually lead to the extinction of office and administrative support jobs like administrative assistants and financial clerks. The entire purpose of these jobs is to lower the cost of transmitting and storing information. In the long run, AI will drive the cost of “routine” information processing down to nearly zero, eventually eliminating the need for most human labor in those jobs (although we will need a lot more energy efficiency to get there!).
There is a clear analogy here to the impact of mechanization on farm labor. For most of human history, the bottleneck to increasing food production was physical power. Steam and electricity eventually relaxed that constraint, and farm work mostly disappeared because we only need so much food.
Similarly, the key bottleneck in business decision-making for most of modern history was a lack of information. Advances in information collection, storage and retrieval eventually relaxed that constraint, and now we are awash in data. Routine office jobs were created in a time of information scarcity, and they may no longer be needed.
A harder question is whether AI will eventually replace jobs that analyze and synthesize information to improve decision-making. I am less certain that will happen, for two reasons. First, there is an essential complementarity between the ways that LLMs and humans analyze information. Generative AI models excel at any task for which there are many existing examples in their training data. They can draw, again and again, on information about anything that has ever happened anywhere, a feat far beyond the human mind. People, on the other hand, are excellent guessers. We reason remarkably well with very little data. Because AI models and humans approach problems differently, I can imagine lots of situations where people who know when to use AI and when to overrule it will do better than either party acting alone.
Second, economic interactions are often strategic, meaning the right decision depends on what you think your competitors will do and how they will respond. Because LLMs reason from training data, they are never fully up to date. Concretely, imagine that two companies are developing strategies to outcompete their opponent for market share. They use a frontier AI model to tell them where they should locate new stores, but the right answer depends on what their opponent is doing, which in turn depends on what AI model the opponent is using. If you both have access to the same AI technology, the winner will be the company with a better human in charge.5
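The circular reasoning in this example (my best move depends on your move, which depends on my move, and so on) is the level-K idea from the footnote. Here is a deliberately stylized sketch, with an invented payoff function, not a model of any real AI system:

```python
# Toy level-K reasoning in a two-firm store-location game on spots 0..4.
# A level-0 player naively picks the middle; a level-k player best-responds
# to what a level-(k-1) opponent would choose. Payoffs are invented.

SPOTS = range(5)

def payoff(mine, theirs):
    # Stylized payoff: you capture every spot strictly closer to you
    # than to your rival.
    return sum(1 for s in SPOTS if abs(s - mine) < abs(s - theirs))

def choose(k):
    if k == 0:
        return 2                      # naive anchor: the middle spot
    rival = choose(k - 1)             # simulate a level-(k-1) opponent
    return max(SPOTS, key=lambda m: payoff(m, rival))

for k in range(4):
    print(k, choose(k))
```

With these payoffs the best responses never settle down (successive levels alternate between spots 1 and 2), which is the point: the “right” answer depends on how sophisticated you believe your opponent to be, and that judgment call sits with the human.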
Still, the frontier of AI technology is advancing rapidly. I am making predictions based only on what I see today and some quasi-linear extrapolation to the near future. If the day comes that an AI agent can run a Fortune 500 company without human assistance, then I, for one, will welcome our new robot overlords.
1 The Early Office Museum contains some wonderfully quirky examples of antique typing machines, including the Kaligraph, Charles Thurber’s Patent Printer, and the Hansen Writing Ball.
2 Two other important innovations were 1) the shift key, which moved a different part of the typebar to contact with the ribbon, allowing for both upper- and lowercase letters to be used without changing the typebar manually; and 2) the QWERTY keyboard layout, which minimized typebar jams by spacing out frequent letter combinations.
3 Electric typewriters were much more reliable and allowed for other complementary improvements like proportional spacing and the typeball or “golfball” design, which reduced jams and allowed multiple fonts to be used in the same document.
4 Caveat – I have no idea how they arrived at this number! It seems like a hard thing to estimate.
5 The idea that what I do depends on what you will do, which depends on what I do, and so on is called “level-K reasoning”. People have already programmed AI agents with level-K reasoning capabilities, but I am not aware of any evidence on whether such agents can reliably outperform people in real-world strategic interactions.