Why AI ‘Feels’ Different by Generation: Quantifying the Perceived Rate of Technological Waves & the Associated Risks
- Joshua Russell, MD
- Sep 28
- 7 min read
Introduction–When Pace is Personal
As I approach another birthday, I have been thinking more about what my age means in today's world and why societal perceptions of age and its meaning have changed over time. Each recent generation has been defined by the wave of technology that unfolded during its members' formative years. The 'Greatest Generation' watched television sweep into their living rooms almost overnight. My peers in their forties remember first getting internet access and the cacophony of clicks and crashes that accompanied a dial-up modem's connection. Younger millennials recall when their parents first allowed them a smartphone, finally granting them unlimited access to the world and, more importantly, to social networks. Most recently, the teenagers of 2025 find themselves coming of age in an AI-first world, having known no tech wave other than this current, and most dramatic, one.
But why does AI feel categorically different, even if it is "just another exponential growth curve"? I recently spoke on this very trend and question at the Symposium in Austin, Texas, among other healthcare leaders and investors. While many have commented on the pace of technological improvements and adoption, the human factor that underlies how these trends affect us has been largely left to nebulous and qualitative descriptors. But can we quantify how the pace of the AI wave feels, and why it doesn't feel the same for everyone? I believe the answer lies in the interaction between the pace of engineering improvements and the nature of human perception. Change isn't experienced in teraflops per second—it's lived in proportion to our age, our dependence, and our fear of being left behind as we enter a new phase of human history. Just as we define growth in finance by the compound annual growth rate (CAGR), I believe we can define the impact of technological advances in subjective terms for any individual: the human perceived growth number (HPGN).

Age is More than Just a Number: Determinants of Perceived Change
1. Performance Improvement Rate – raw doubling times (e.g., AI-compute estimates hover around a doubling time of 6 months currently).
2. Adoption Curve – this is necessarily sigmoidal, or 'S-shaped', and is experienced mostly in relation to our peers. Since we are focused on perception, the domestic adoption rate for one's country is more appropriate than trends on a global scale, because we compare ourselves most naturally to those around us.
3. Observer Age Relative to Pace of Trend – a six-month doubling time is 1/86th of my life at 43, but 1/160th of an 80-year-old's. This is highly meaningful in how we perceive time, and it underpins why weeks and months seem to pass progressively faster as we age.
4. Overwhelm Factor – This part is more difficult to quantify but must include: 1.) Prior waves lived through (TV, internet, smartphones). 2.) Dependence on the Technology: TV was (and is) optional, whereas internet, smartphones, and, increasingly, AI are not. We can’t order food at many restaurants without scanning a QR code. 3.) Fear factor: when a person’s survival, relevance, or livelihood depend on the tool, the pace of change feels sharper.
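The age-relative arithmetic in point 3 is easy to check directly. A minimal sketch (the helper name is mine, not the article's):

```python
def life_fraction(age_years: float, doubling_time_years: float) -> float:
    """Fraction of the observer's life that one doubling period represents."""
    return doubling_time_years / age_years

# A 6-month (0.5-year) doubling time as a share of different lifetimes:
print(round(1 / life_fraction(43, 0.5)))  # 86  -> 1/86th of a 43-year-old's life
print(round(1 / life_fraction(80, 0.5)))  # 160 -> 1/160th of an 80-year-old's life
```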
Modeling the Human Perceived Growth Number (HPGN) Mathematically
Unlike growth rates, which are unbounded, the HPGN conveys a subjective human experience and is therefore confined to a finite range. We can formalize the perception of tech improvement with an equation that blends growth, adoption, and age.
Note: If Greek letters or exponents are triggering, you can skip to the interpretation of various HPGN values.
Let:
- Td_eff = effective doubling time (performance + adoption)
- a = age of observer (years)
- w = number of prior waves experienced (with weighting for dependence)
- β = exponent scaling the strength of age’s effect
- κ = calibration constant
Then:
HPGN(a, Td_eff, w) = min(1, (1 - e^(-κ·(a/Td_eff)^β)) · (1 + 0.1w))
Interpretation of HPGN
Values range from 0 (imperceptible) to 1 (practically instantaneous).
- HPGN=0.2–0.4: change is rapid but manageable for most
- HPGN=0.6–0.8: change feels unsettlingly fast and disorienting; overwhelm is expected
- HPGN=0.9–1.0: change saturates perception and feels infinite; resignation is natural

Based on this model and the pace of AI improvement (~0.5-year doubling time), HPGN values would range from 0.498 (teen) to 0.841 (age 43) to 0.975 (age 80).
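The formula above is straightforward to implement. In this sketch, the values of κ and β are illustrative placeholders (the article does not publish its calibration constants), so the outputs will not exactly match the figures quoted; what the sketch shows is the qualitative behavior—HPGN rising with age and saturating at 1:

```python
import math

def hpgn(age: float, td_eff: float, waves: int,
         beta: float = 0.6, kappa: float = 0.08) -> float:
    """Human Perceived Growth Number per the article's formula.

    beta and kappa are illustrative assumptions; the article does not
    specify its calibration constants.
    """
    base = 1 - math.exp(-kappa * (age / td_eff) ** beta)
    return min(1.0, base * (1 + 0.1 * waves))

# With a ~0.5-year doubling time, HPGN rises monotonically with age
# (prior waves experienced: assumed 0 for a teen, 2 at 43, 3 at 80):
for age, waves in [(15, 0), (43, 2), (80, 3)]:
    print(f"age {age}: HPGN = {hpgn(age, 0.5, waves):.3f}")
```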
Accounting for Age-Related Cognitive Decline
Abundant data confirm that cognition declines with age. Cognitive processing speed, reasoning, and certain memory domains begin to decline in midlife and then deteriorate more rapidly beyond our 60s. While the rate of decline varies dramatically between individuals, the average trend is undeniable. This means that, at HPGN values where perceived speed becomes saturated, individuals also, frustratingly, often begin to experience noticeable changes in cognition.
For those living it, this reality is far from academic—it's practical. The octogenarians now left no alternative but to navigate telehealth portals and chatbot-based appointment scheduling must do so while the math (HPGN) and the biology (cognitive decline) conspire to produce a disorienting and justifiably frightening, yet unavoidable, new reality.

The Impact of Medical Complexity
Medical complexity is an unexpected but unavoidable reality of interacting with the healthcare system. I've seen this unfold countless times. Patients and doctors alike consistently underestimate the exponential impact of each additional test, healthcare visit, and prescription. With multimorbidity and polypharmacy, complexity is not additive or linear—it compounds. For example, with five active medications, potential drug–drug interactions number ten; with ten medications, there are forty-five. Complexity balloons faster than cognition can compensate. As an aside, this is currently a huge unmet need that AI is well-poised to address within medicine—but that's a topic for another day.
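The interaction counts above follow directly from the binomial coefficient C(n, 2) = n(n − 1)/2, the number of ways to pick a pair from n medications. A minimal check:

```python
from math import comb

def potential_interactions(n_medications: int) -> int:
    """Number of possible pairwise drug-drug interactions among n medications."""
    return comb(n_medications, 2)  # equivalent to n * (n - 1) // 2

print(potential_interactions(5))   # 10
print(potential_interactions(10))  # 45
```

Note how the count grows quadratically, not linearly—doubling the medication list more than quadruples the pairs to track.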

In the face of failing health in all its various forms, older adults face existential risks even without the threat of iatrogenesis–the harms from medical care itself–which inevitably also compound with increasing complexity. This not only increases perceived risks, but actual risks. This creates a ‘perfect storm’ phenomenon whereby those facing the greatest existential risks are also those most likely to be reduced to helplessness by the pace of technological advancement.
Ageism and the Overlooked Value of Wisdom
In the face of these trends, we run a large societal risk of equating "falling behind the pace of progress" with irrelevance. If we take the HPGN as an absolute truth, ageist biases loom large. A 59-year-old may have an HPGN in a range that suggests they are incapable of functioning in a tech-driven system, for instance. But this lens fails to account for crystallized intelligence—the accumulated wisdom that, as Arthur Brooks argues in From Strength to Strength, rises with experience even as our raw cognitive processing abilities fade. Older adults bring the value of pattern recognition, lived historical perspectives, and the emergent capacity for discernment that arises from these. They've witnessed prior waves of technology unfold against the backdrop of the culture of the era and know the social choreography associated with disruption. If we sideline or disregard those with crystallized intelligence, our collective risk of falling victim to the myriad hazards baked into unbridled tech adoption will balloon.
Perceived Pace, Perceived Risk, and Performance
The perception of the pace of change is inseparable from the perception of risk:
- Older adults face existential risk—survival, independence, health.
- Middle-aged adults face professional risk—roles redefined at breakneck pace.
- Younger adults and teens face lifetime risk—living entire careers under the specter of being replaced by technology.

Paradoxically, we don't always act faster or more conscientiously when perceived risks are higher. In fact, the converse is more often the case. We deliberate, hesitate, and over-analyze in the face of high perceived risk. I see this in myself when caring for patients in urgent care and the emergency department. The more uncertainty I feel, the more tempted I am to delay a discharge decision in favor of ordering another test and reassessing later. While this is often clinically appropriate, in industries where competitive advantage belongs only to organizations that can make rapid decisions about integrating new technologies (most certainly including healthcare), excessive deliberation becomes its own liability.
The Vicious Cycle of Perceived Risk:
1.) Fast change feels risky.
2.) Risk slows decisions.
3.) Slower decisions deepen disadvantage in fast systems.

Take-Home Message on Interactions between Perceptions of Change & Risk
Perception of the pace of change converges as we reach the theoretical limits of our biology. Based on the pace of human cognitive development, it is reasonable to assume that those under age 10 should be excluded as candidates for adopting new technology for societally meaningful use. At this limit, a doubling time of 5.2 weeks would represent a threshold beyond which no real-world user of the technology would feel capable of keeping up with the pace of change. AI-compute has already been estimated to double in timeframes as low as three months—a figure approaching the same order as the 5.2-week/36-day theoretical maximum. It is also likely that other factors of human biology and culture (e.g., length of workdays, need for sleep, holidays, communication, supply chain logistics) will influence the functional limit for doubling time.
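As a quick unit sanity check on the stated threshold (the 5.2-week figure is taken as given from the text above; only the conversions are computed here):

```python
WEEKS_PER_YEAR = 365.25 / 7  # ~52.18

limit_weeks = 5.2
limit_days = limit_weeks * 7              # ~36.4 days, matching the ~36-day figure
limit_years = limit_weeks / WEEKS_PER_YEAR  # ~0.0997 years, i.e. roughly a tenth of a year

print(round(limit_days, 1), round(limit_years, 3))
```

In other words, the proposed floor is a doubling time of about a tenth of a year—several times shorter than the three-month compute-doubling estimates already being reported.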
Conversely, perceived risk diverges. For older adults, the risk is survival. For midlife professionals, the risk is identity and livelihood. For the young, the risk is lifetime employability and meaning. Yet across all groups, the paradox holds: fear of change, overwhelm, and nihilism can slow the very decisions needed to adapt to it. And with the pace of change being what it is, we can scarcely afford to be reduced to helplessness for long. The challenge therefore becomes not technological improvement but, as it always is, managing the human factors of perception and emotion. If we can help people recalibrate—valuing both mental agility and wisdom—there remains hope that we can avoid the paralysis that arises when we feel overwhelmed.
____________
In short, AI feels different not because it is faster in absolute terms, but because it lands differently in perceived risk landscapes across generations. While we all face a personal HPGN that incrementally creeps upward with each passing moment, we must resist the urge to view this number as deterministic in isolation.
As this most recent tech wave crashes down on us, ‘winning’ the future depends not only on our nimbleness to adapt to the increasing pace of change but equally on our ability to manage the fears associated with the various perceived risks.
Regardless of your age, the pace of change will only feel faster and more overwhelming in the future. So, our best chance to ride each tech wave for as long as possible becomes consciously building habits starting today that involve leaning into progress–especially when it feels most uncomfortable. This is (and will continue to be) a distinct competitive advantage because those with intentional and conscious approaches are always in the minority. While challenging, trying to keep up will remain infinitely more achievable than trying to catch up.
(*For ease and speed of production, the above figures were generated with LLM assistance. They have been reviewed for accuracy but may contain aesthetically awkward visual formatting quirks.)