Development in Progress
The concept of progress is at the heart of humanity’s story. | Jul 16, 2024
Deindustrialization has made it impossible for America to economically support a stable urbanized working class, resulting in the social and physical decay of cities that is only partially ameliorated by more recent urban gentrification and the accompanying service sector jobs. Fresh thinking is needed in everything from education to architecture to city planning to transportation, but it is so far lacking.
The process of industrialization in the United States began to take off around the time of the Civil War. The rapid growth of the railroad across the nation, coupled with the increasing surpluses provided by mechanized agriculture, provided a base for the development of industry and the resultant rise of cities and the mass urban labor force. From around 1840 to the early 1950s, industry consistently grew as a portion of the American labor force, interrupted only by the Great Depression[1]. To accommodate this industry, a wave of city building followed, adding growth to the old trading cities of the East such as Philadelphia and New York, and leading to the rise of new interior cities like Detroit, Chicago, and St. Louis. This process continued through the late 19th and early 20th centuries, stalling in the decade or so following the Second World War with the widespread growth of suburbs. City building in America has largely been in decline ever since.
The contours of the development of the US industrial base can be seen by tracing the development of the American machine tool industry. Machine tools are a good proxy for the overall robustness of an industrial ecosystem, as they form the base technical layer for much of heavy industry.[2] In the United States, production of machine tools was scaled up during the Civil War, and by the 1880s high precision machine tools were being mass produced.[3] Mass production continued to scale up through the Second World War, and technological innovation in the industry continued through the first decades of the Cold War—but by the 1970s the industry in the U.S. was stagnant.[4] Competitors in Germany, Japan and China dominated. For the purposes of this investigation, this developmental course can be read as a rough periodization of the rise and fall of American industrial capacity in general.
The United States became majority urban sometime between 1910 and 1920—but American urbanization unfolded unevenly across space and time, generally spreading on the coattails of industry from the Northeast, then to the Midwest, then to the West, and finally to the South. Rail coverage followed this general pattern of spatial expansion, with mileage tripling between 1860 and 1880, and again between 1880 and 1920.[5] The first states to become majority urban were Massachusetts and Rhode Island around 1850, and the Northeast as a whole became majority urban by 1880.[6] Both the Midwest and the West became majority urban in the 1910s. As for the South, that region did not become majority urban until the 1950s, following its belated postwar industrialization (and the concomitant advent of air conditioning).[7]
In American cities, flourishing industries provided masses of American workers with social mobility on a scale never before seen.
In American cities, flourishing industries provided masses of American workers with social mobility on a scale never before seen. After World War II, deals negotiated between labor and management institutionalized collective bargaining within the industrial sector, giving workers an unprecedented degree of economic security. Walter Reuther’s Treaty of Detroit, negotiated in 1949 between the United Auto Workers union and General Motors, is a typical example of these arrangements. In exchange for protecting GM from annual strikes, it gave workers health coverage, retirement benefits, cost-of-living adjustments for wages, and guaranteed vacation time.[8] For millions of Americans at this time, the American dream was in fact achievable.
To a large extent, this system of political economy subsidized many other parts of city life. It supported entire local business ecosystems, with small businesses catering to local workers. It supported a strong local tax base, which went to funding public infrastructure such as schools, parks, firefighting, and policing. And most importantly, it supported a dense and complex network of social capital, which allowed families, groups of friends, religious institutions, and voluntary institutions all to flourish and mediate community life. The industrial city was bustling and vibrant—for example, in 1941, a streetcar ran every sixty seconds through the heart of Detroit.[9]
This era is often romanticized, but its high level of social stability is visible by certain measures: the homicide rate, for example, fell in 1935 to its lowest level of the industrial era as the system of political economy matured under New Deal reforms, and stayed there until the mid-1960s, when the dividends of deindustrialization started to come due en masse in America’s cities.[10]
Industrial capital also drove the physical development of America’s cities to unforeseen heights, literally, financing the likes of the Chrysler Building even during the Great Depression. Every major U.S. city now has a cluster, sometimes a forest, of skyscrapers housing mostly service industries. Beyond cultural icons, though, the United States used its industrial surplus to invest in its infrastructure at the highest rate ever in its history, allowing cities to enjoy a well-functioning physical environment.[11] In fact, America has not really built a city in this fashion since the beginning of deindustrialization. Newer cities such as Orlando or Las Vegas grew largely due to suburban sprawl—not organized around urban life in any communal sense—with large commuter populations who spend the vast majority of their lives outside of downtown.
The American urban landscape during peak industrialization, around the year 1950, reveals the context in which deindustrialization occurred. According to the 1950 Census, the top ten most populous cities in the U.S. at that time were, in order: New York, Chicago, Philadelphia, Los Angeles, Detroit, Baltimore, Cleveland, St. Louis, Washington D.C., and Boston. Since the previous census in 1940, all of these cities had registered consistent growth. The order of this list had shuffled around a bit in the intervening ten years, but its contents remained unchanged except for the replacement of Pittsburgh by Washington D.C., which had grown rapidly with the size of the federal government during the war.[12]
This 1950 census is interesting as a snapshot of the American urban environment at its height, on the cusp of decline. Of these cities, all except Washington D.C. had grown on the back of industrialization. What’s more, the population of each of these cities save for Los Angeles and New York peaked in this census, and as of 2020 none had come close to its 1950-era population (New York for its part began a decades-long decline but eventually recovered its earlier numbers). In all of the cities that witnessed population decline besides Washington D.C., deindustrialization played a large part in this process.
The general trend of American deindustrialization is evident in basic national labor data. Total manufacturing jobs peaked in late 1943 at the height of World War II, fell back to prewar levels as the country demobilized, and then rose continuously until the early 1970s, when job growth stagnated. Manufacturing employment reached its final peak in 1979, then entered a slow decline lasting until 2000. At that point, the decline became precipitous, with the country losing nearly 36% of its manufacturing jobs during the decade from 2000 to 2010. Since then, aggregate manufacturing jobs have seen some growth, though the 2020 shock of the COVID-19 pandemic brought them back down to 2010 levels.
Deindustrialization has unfolded very unevenly. As industry in America was a heavily regional—and often even a city-by-city—mode of economic organization, American deindustrialization was a regional process as well. It occurred at a level of granularity not captured by the national analysis above. Some areas of New England, for example, began to deindustrialize as early as the 1920s following the collapse of the local industrial base after World War I; whereas some areas of the South did not begin to lose factories until the mid-2000s.[13]
This has been a very heterogeneous process, unfolding in multiple distinct phases. The first was the deindustrialization of cities like Detroit, St. Louis, and New York beginning in the mid-to-late 1950s. A second came with Federal Reserve policy in the period 1979-1984, when interest rates were raised to quash inflation; among other things, this drove up the international value of the dollar, eviscerating the American machine tool industry in the face of Japanese competition. A third was the deindustrialization of California’s low- and high-tech manufacturing industries starting in the 1980s and 1990s.[14] The most recent phase has consisted of the most extreme decimation of manufacturing jobs in American history, which occurred between 2000 and 2010 thanks to such convergent blows as NAFTA, the global automotive industry crisis, and the “China shock”: the miscalculation that China would remain a mostly low-wage, low-skilled labor economy after U.S.-China trade relations were normalized in 2000.[15]
[...] deindustrialization directly undermines the economic sustenance of entire communities, which in the absence of other economic opportunities naturally leads to impoverishment.
The societal effects of deindustrialization are in principle similar everywhere, regardless of local conditions. In areas that rely on industry, deindustrialization directly undermines the economic sustenance of entire communities, which in the absence of other economic opportunities naturally leads to impoverishment. This often leads to a vicious circle: the loss of homes, the loss of pensions, a reduction in the local tax base, a reduction in local public services provision, a loss of resources for local institutions, and physical urban decay. Moreover, deindustrialization in the United States almost always means a concomitant loss of employee healthcare benefits, as healthcare in America is very often tied to employment.[16]
In addition to pulling the material base out from under a particular community, deindustrialization also deprives workers of access to the stable social role of “having a steady job” and participating in a public sphere of others with the same role. The strong social fabric that comes with an industrial sector is often irreplaceable by relatively precarious service sector work. This material and social erosion then begins to work its way up the deindustrialized community’s entire social and institutional stack: in addition to poverty and mass unemployment, deindustrialization has been shown to bring with it the collapse of social capital, and the collapse of social capital has in turn been demonstrated to lead to family breakdown, loss of public trust, the collapse of local institutions, and rising rates of violence, drug abuse, depression, and suicide.[17]
In many cases in American history, deindustrialization has also stoked racial tension. Accelerated industrialization across the North in the first half of the 20th century induced millions of African-Americans to move out of the rural South and into northern cities, an event known as the Great Migration. They found mass employment in new industrial jobs—although their conditions were often poor relative to white workers.[18] The stresses of deindustrialization have contributed to race riots, such as those of 1967 Detroit, and even long-term breakdowns of public order, a condition that can be found to this day in large swathes of many post-industrial cities.[19]
The material and social collapse heralded by deindustrialization, coupled with intense racial conflicts, often led to the wholesale abandonment of urban centers. The most notorious example of this is white flight, the mass abandonment of certain city neighborhoods by whites starting in the 1950s and 1960s, but often those with the means to leave collapsed urban centers will do so, regardless of race (see for example the continuous outflow of Baltimore’s black middle class).[20] Urban breakdown caused by the loss of industry has induced urban depopulation, and urban depopulation in turn further deprives cities of resources, leading to a feedback loop from which it is nearly impossible to escape under conditions of economic hollowing-out.
The extent of this hollowing-out is best shown by example, and examples are unfortunately easy to find. Most apparent is the degree of urban depopulation. Detroit is the only American city ever to grow above one million people and then contract back to below that number, having lost over 60% of its people since 1950—a time span during which the national population has nearly doubled.[21] St. Louis lost 64% of its population, Baltimore lost nearly 40%, and Chicago lost over a quarter—all mostly to the new suburban developments springing up across the country. The vicious cycle of urban decay and increased suburbanization also often hits city finances particularly hard, with the most notorious example being New York City’s near-bankruptcy and resulting bailout from the federal government in 1975.[22]
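The depopulation percentages above are simple arithmetic over decennial census counts. As a quick illustrative check, the sketch below recomputes them from approximate 1950 and 2020 census figures; the exact values vary slightly by source and reference year, so treat the numbers here as rough, not authoritative.

```python
# Approximate decennial census counts (1950, 2020) for three of the
# cities discussed above. Figures are illustrative, rounded from
# published census totals.
CENSUS = {
    "Detroit":   (1_849_568, 639_111),
    "St. Louis": (856_796, 301_578),
    "Baltimore": (949_708, 585_708),
}

def pct_loss(then: int, now: int) -> float:
    """Percentage decline from `then` to `now`."""
    return (then - now) / then * 100

for city, (pop_1950, pop_2020) in CENSUS.items():
    print(f"{city}: {pct_loss(pop_1950, pop_2020):.1f}% loss since 1950")
```

On these figures, Detroit comes out to roughly a 65% loss, St. Louis roughly 65%, and Baltimore roughly 38%, consistent with the magnitudes cited in the text.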
Such massive shifts aside, the severity of deindustrialization is best grasped at the local level. During an infamous event known as “Black Monday,” on September 19, 1977, in the city of Youngstown, Ohio, the steel manufacturer Youngstown Sheet & Tube announced the termination of 5,000 manufacturing jobs, effective immediately. Youngstown was heavily dependent on this industry for its basic existence as a center of economic and social life, and Black Monday heralded the city’s deindustrialization. In the few years after Black Monday, the city lost 40,000 manufacturing jobs, 400 businesses which the industrial sector had supported, over $400 million in personal income, and in some localities up to 75% of school tax revenue.[23] Since then the city has lost nearly 60% of its population, and has not economically recovered.
This is not an isolated or temporary trend. In a study on the long-term nature of deindustrialization, researchers at Youngstown State University note that the effects of deindustrialization have been so pervasive as to permanently alter the characteristics of unemployment nationwide.[24] In 1979, they note, 8.6% of the unemployed had been out of work for more than six months, while in 2005, this figure had more than doubled, coming in at 19.6%. Post-industrial impoverishment, too, is a long-term condition. The authors note that a few years after the major factory closures in Gary, Indiana, one fifth of that city’s households lived in poverty. Today, over a third of Gary residents remain impoverished.[25]
The dynamics of urban collapse precipitated by deindustrialization are most infamously associated with Rust Belt cities such as Youngstown, but as deindustrialization has become a near universal trend across the American urban landscape, so too have its effects become universalized, as the Youngstown State researchers crucially noted. For example, they write that in 2009, the recently deindustrialized city of El Centro, California had an unemployment rate of 24.5%, comparable to the 25% rate recorded in Youngstown in 1980.
American cities today are plagued with the effects of deindustrialization, with the working classes bearing the brunt of the decay across the board—in terms of poverty, long-term unemployment, social dysfunction, and crime. These trends have proven remarkably resilient over time; the American urban landscape has never truly recovered from deindustrialization. Nearly every major city in America is littered with the vestiges of long-gone communities, with blocks of vacant lots, entire neighborhoods in disrepair, and abandoned schools, community centers, and churches.
On the surface, though, census data shows that the United States has been continuously urbanizing over its entire history, and by 2010 had reached an urban population of 80%, with the remaining 20% rural. The urban population in 1950, by contrast, only accounted for 64% of the population.[26] So how can we say that the American city has long been in decline?
This data is somewhat misleading, since the census makes no distinction between urban and suburban, meaning that the vast majority of suburbs in America are recorded as “urban” areas (the only other choice being “rural”).[27] And just as industrialization often led to urbanization, so too did deindustrialization often lead to suburbanization. Mass population outflow to suburbia gathered steam in the mid-1950s and continues steadily to this day—by 2000, half of the country was suburban.[28] Suburbanization has created sprawling new areas of settlement in places like Phoenix and Houston which, though recorded on the census as “urban,” are nearly entirely suburban. This can be seen by measuring today’s urban landscape against the peak industrial urbanism of 1950. According to the 1950 Census, the twenty largest cities in America at that time held almost 20% of the country’s population. By 2006, that number had fallen to 10%.[29] In other words, America’s largest cities are half as large relative to the rest of the country today as they were in 1950.[30]
It can be argued that deindustrialization is not a problem since, thanks to the workings of the market, laid-off factory workers will be able to find work in a new sector better suited to the comparative advantage of their local area (in the American discourse, such new opportunities are often described as service sector or high-tech jobs).[31] Besides, freeways and mass transit make commuting relatively simple, and at least until COVID, many knowledge workers could dine, shop, and be entertained in the city. Unfortunately, empirical reality has not been so frictionless. The service sector, especially in a blighted city, has no use for the large, coordinated workforces of the factory floor, and can only employ a much smaller number of workers relative to industry. The work it does provide is often precarious, with low pay and few benefits.[32] Thus, when industry disappears, so do the conditions for a mass working class concentrated in cities. These communities largely dissolved, either moving out to less affluent suburbs or scraping by in miserably blighted inner cities, since the new conditions of political economy no longer supported them. Overly simplistic economic models of free trade, which posited seamless transitions between sectors, did not account for this rapid immiseration.[33]
Furthermore, according to a 2013 meta-analysis in the Journal of Urban Affairs, every Rust Belt city that has retained economic and social stability through industrial decline has done so through the retention of some portion of their industrial sector via active state industrial policy, and not through an attempted transition into the information economy, as most often seen in the mimicries of Silicon Valley culture presented in the marketing campaigns of desperate city governments.[34] What this state action has looked like, the author argues, is protection of local bases of implicit manufacturing knowledge, coordination with local business elites in order to build a reciprocal ecosystem of “satellite businesses” around the manufacturing sector, and the influence of higher-level politicians and business leaders in order to protect industry.
A rare example of these conditions taking place in America, according to the study, is Elkhart, Indiana, which protected its local tradition of recreational vehicle (RV) manufacturing and coordinated corporate and labor elites in the construction of a flourishing and diversified manufacturing sector centered around the production of RVs, trailers, trucks, and mobile homes.[35] Notably, Elkhart has seen very strong population growth in recent decades, and boasts a vibrant downtown. Unfortunately, such functional coordination is the exception rather than the norm in the United States, and we have nothing approaching such coordinated industrial policy on the national scale.[36]
The post-industrial collapse of community and urban life has occurred on a national scale, but the decline of social capital has been an uneven process, and many analyses have shown that its effects have been far worse for the working classes than they have among elites. This holds true across the board, from material prosperity to social capital to marriage rates.[37] Most visibly, recent years have seen a sharp increase in deaths of despair—that is, deaths resulting from alcoholism, drug abuse, and suicide—among the white working class, which bore the brunt of the most rapid loss of manufacturing jobs in American history, between 2000 and 2010. [38]
This disparity gets at the heart of our current political economy. Deindustrialization in America has caused a significant class sorting, separating the country into those whose livelihoods have been devastated by the loss of industry, and those who have weathered deindustrialization and thrived. The service sector has not come close to providing the urban community stability and economic security once provided by industry.[39] The most stark example of this abject lack of opportunity can be seen in the labor force participation rate, which has been continuously declining among men (who are the vast majority of industrial workers) since the start of deindustrialization, dropping from around 85% in the late 1950s to roughly 67% today.[40] The recent discourse around Universal Basic Income can be seen as another way to deal with this surplus population, rather than as a reaction to any mass automation of existing American industry. The widespread prosperity promised with the advent of the so-called “New Economy” or the “Information Economy” never in fact materialized, and the only class to truly remain intact in recent decades has been the highly-educated elite, largely composed of corporate and nongovernmental managers, civil servants, and “symbolic analyst” knowledge workers.[41] The economic safety enjoyed by this class has not extended to the working classes.[42]
We see this disparity reflected in our cities today. They are often very segregated by class, with enclaves of wealthy professionals concentrated in upscale urban areas or posh inner-ring suburbs.[43] There is often also a working-class neighborhood of both recent immigrants and native workers, who exist as service sector workers willing to cater to the market demand created by professional class elites, whether in traditional service jobs or, increasingly, sporadic gig work. And finally, there remain the blighted old industrial neighborhoods, some of which—especially in the case of urban working-class black neighborhoods in the East and Midwest—have been in a state of de facto social collapse for over sixty years.
Urban areas in the United States no longer have very much of a propertied bourgeois middle class, with that class largely relegated to small-business owners in exurban and rural areas.
As our labor force participation rate continues to sink, and our post-industrial class sorting continues apace, it is becoming increasingly difficult to make any class distinction in the American city beyond identifying the winners (or, more accurately, the survivors) of deindustrialization and the losers. There is largely nothing in between. Urban areas in the United States no longer have very much of a propertied bourgeois middle class, with that class largely relegated to small-business owners in exurban and rural areas.[44]
It is true that American cities have experienced something of a renaissance since the 1990s, with the re-entrance of elites into many major cities, a trend known as “gentrification”.[45] The most direct benefit of gentrification has been an increased tax base in many major American cities, particularly New York City and Washington, D.C. However, the benefits of this process have not accrued to existing post-industrial working-class communities, which despite building booms bringing new service sector businesses into their deindustrialized neighborhoods have largely remained mired in poverty.[46]
From this we can conclude that gentrification has not meaningfully altered the political economy of the deindustrialized American city. Though increased demand for urban services may have greatly increased the number of urban gig workers doing things like driving for Uber Eats, this has not led to the sectoral stabilization of the urban working class, which has remained mostly dissolved since the flight of industry. Thanks to the precarity of the service sector and the hyper-precarity of ad hoc gig economy jobs, and in the absence of other stable options for mass employment, we can expect that any advancement in the condition of urban service workers will not be permanent and structural, but rather temporary and highly contingent on the consumption levels of urban elites. Raising the national minimum wage to $15, if it happens, won’t solve the problem, either.
What’s more, the shock of the COVID-19 pandemic may have caused a resumption of the postwar American tendency toward suburbanization. The pandemic threw class dynamics into sharp relief, with increasing numbers of knowledge workers able to work remotely and considering permanent escape from the crowded city, while “essential workers” remain tied to their service jobs or deal with unemployment. Elites can choose to leave the city at any time, and as skyrocketing real estate prices in New York City’s suburbs can attest, they may be gearing up to do so yet again.[47]
Because the vast majority of prosperity in our post-industrial age has redounded to the elite, much of what happens in America’s cities is contingent on the behavior of this class.[48] Whether in the realm of politics, the economy, or culture, the elite now wields a greater level of influence over society simply because many of the social classes below them have essentially collapsed.
If the portion of the elite that has urbanized since the 1990s once again flees the cities, the urban working classes may very well lose what little security the gentrification-era service boom provided to them, and without industry, there is nothing we can do to ensure their long-term economic well-being. We have been taught this lesson for seventy years, but we have yet to learn it. This is unsurprising, as the American elite is by its nature isolated from the immediate economic realities of deindustrialization. They do not bear the material or social costs of deindustrialization, so we should not expect to see a societal focus on these deep problems anytime soon.
The noun “affordance” was coined by the American ecological psychologist James J. Gibson. It was initially used in the study of animal-environment interaction and has since been extended to the study of human-technology interaction. An affordance is an available use or purpose of a thing or an entity. For example, a couch affords being sat on, a microwave button affords being pressed, and a social media platform affords letting users share content with each other.
Agent provocateur is French for “provocative agent.” The term refers to individuals who attempt to persuade another individual or group to partake in a crime or rash behavior, or to implicate them in such acts, in order to defame, delegitimize, or criminalize the target. Examples include starting a conflict at a peaceful protest or attempting to implicate a political figure in a crime.
Ideological polarization is generated as a side effect of content recommendation algorithms optimizing for user engagement and advertising revenue. These algorithms upregulate content that reinforces existing views and filter out countervailing information, because doing so has been proven to drive time on-site. The result is an increasingly polarized perspective founded on a biased information landscape.
To “cherry pick” when making an argument is to selectively present evidence that supports one’s position or desired outcome, while ignoring or omitting any contradicting evidence.
The ethical behavior exhibited by individuals in service of bettering their communities and their state, sometimes forgoing personal gain in pursuit of a greater good for all. In contrast to other sets of moral virtues, civic virtue refers specifically to standards of behavior in the context of citizens participating in governance or civil society. What constitutes civic virtue has evolved over time and may differ across political philosophies. For example, in modern-day democracies, civic virtue includes values such as guaranteeing all citizens the right to vote regardless of culture, race, sex, religion, nationality, sexual orientation, or gender identity. A shared understanding of civic virtue among the populace is integral to the stability of a just political system, and waning civic virtue may result in disengagement from collective responsibilities, noncompliance with the rule of law, a breakdown in trust between individuals and the state, and degradation of the intergenerational process of passing on civic virtues.
Closed societies restrict the free exchange of information and public discourse, and impose top-down decisions on their populace. Unlike the open communications and dissenting views that characterize open societies, closed societies promote opaque governance and prevent the public opposition that might be found in free and open discourse.
A general term for collective resources in which every participant of the collective has an equal interest. Prominent examples are air, nature, culture, and the quality of our shared sensemaking basis or information commons.
The cognitive bias of 1) exclusively seeking or recalling evidence in support of one's current beliefs or values, 2) interpreting ambiguous information in favor of one’s beliefs or values, and 3) ignoring any contrary information. This bias is especially strong when the issues in question are particularly important to one's identity.
In science and history, consilience is the principle that evidence from independent, unrelated sources can “converge” on strong conclusions. That is, when multiple sources of evidence are in agreement, the conclusion can be very strong even when none of the individual sources of evidence is significantly so on its own.
While “The Enlightenment” was a specific instantiation of cultural enlightenment in 18th-century Europe, cultural enlightenment is a more general process that has occurred multiple times in history, in many different cultures. When a culture goes through a period of increasing reflectivity on itself it is undergoing cultural enlightenment. This period of reflectivity brings about the awareness required for a culture to reimagine its institutions from a new perspective. Similarly, “The Renaissance” refers to a specific period in Europe while the process of a cultural renaissance has occurred elsewhere. A cultural renaissance is more general than (and may precede) an enlightenment, as it describes a period of renewed interest in a particular topic.
A deep fake is a digitally altered (via AI) recording of a person, created for purposes such as political propaganda, sexual objectification, defamation, or parody. Deep fakes are becoming progressively more difficult for the untrained eye to distinguish from reality.
Empiricism is a philosophical theory that states that knowledge is derived from sensory experiences and relies heavily on scientific evidence to arrive at a body of truth. English philosopher John Locke proposed that rather than being born with innate ideas or principles, man’s life begins as a “blank slate” and only through his senses is he able to develop his mind and understand the world.
The epistemic commons comprises both the public spaces (e.g., town hall, Twitter) and private spaces where people come together to pursue a mutual understanding of issues critical to their society, and the collection of norms, systems, and institutions underpinning this society-wide process of learning. It is a public resource; these spaces and norms are available to all of us, shaped by all of us, and in turn influence the way in which all of us engage in learning with each other. For informed and consensual decision-making, open societies and democratic governance depend upon an epistemic commons in which groups and individuals can collectively reflect and communicate in ways that promote mutual learning.
Inadvertent, emotionally or politically motivated closed-mindedness, manifesting as certainty or overconfidence when dealing with complex, indeterminate problems. Epistemic hubris can appear in many forms: it is often demonstrated in the convictions of individuals influenced by highly politicized groups; it shows up in corporate or bureaucratic contexts that err toward certainty because of information compression requirements; and it appears in media, where polarized rhetoric is incentivized by its attention-grabbing effects. Note: for some kinds of problems it may be appropriate or even imperative to have a degree of confidence in one's knowledge—this is not epistemic hubris.
An ethos of learning that involves a healthy balance between confidence and openness to new ideas. It is neither hubristic, meaning overly confident or arrogant, nor nihilistic, meaning believing that nothing can be known for certain. Instead, it is a subtle orientation that seeks new learning, recognizes the limitations of one's own knowledge, and avoids absolutisms or fundamentalisms—rigid and unyielding beliefs that refuse to consider alternative viewpoints. Those who demonstrate epistemic humility will embrace truths where these are attainable but are generally inclined to continuously update their beliefs with new information.
This form of nihilism is a diffuse and usually subconscious feeling that it is impossible to really know anything, because, for example, “the science is too complex” or “there is fake news everywhere.” Without a shared ability to make sense of the world as a means to inform our choices, we are left with only the game of power. Claims of “truth” are seen as unwarranted or intentional manipulations, as weaponized or not earnestly believed in.
Epistemology is the philosophical study of knowing and the nature of knowledge. It deals with questions such as “how does one know?” and “what is knowing, known, and knowledge?”. Epistemology is considered one of the four main branches of philosophy, along with ethics, logic, and metaphysics.
Derived from a Greek word meaning custom, habit, or character: the set of ideals or customs that forms the foundation around which a group of people coheres. This includes the set of values from which a culture derives its ethical principles.
The ability of an individual or group to shape the perception of an issue or topic by setting the narrative and determining the context for the debate. A “frame” is the way in which an issue is presented or “framed”, including the language, images, assumptions, and perspectives used to describe it. Controlling the frame can give immense social and political power to the actor who uses it because the narratives created or distorted by frame control are often covertly beneficial to the specific interests of the individual or group that has established the frame. As an example, politicians advocating for tax cuts or pro-business policies may use the phrase "job creators" when referring to wealthy corporations in order to suggest their focus is on improving livelihoods, potentially influencing public perception in favor of the politician's interests.
Discourse oriented towards mutual understanding and coordinated action, with the result of increasing the faith that participants have in the value of communicating. The goal of good faith communication is not to reach a consensus, but to make it possible for all parties to change positions, learn, and continue productive, ongoing interaction.
Processes that occupy vast expanses of both time and space, defying the more traditional sense of an "object" as a thing that can be singled out. The concept, introduced by Timothy Morton, invites us to conceive of processes that are difficult to measure, always around us, globally distributed and only observed in pieces. Examples include climate change, ocean pollution, the Internet, and global nuclear armaments and related risks.
Information warfare is a primary aspect of fourth- and fifth-generation warfare. It can be thought of as war with bits and memes instead of guns and bombs. Examples of information warfare include psychological operations like disinformation, propaganda, or manufactured media, or non-kinetic interference in an enemy's communication capacity or quality.
Refers to the foundational process of education which underlies and enables societal and cultural cohesion across generations by passing down values, capacities, knowledge, and personality types.
The phenomenon of having your attention captured by emotionally triggering stimuli. These stimuli strategically target the brain center that we share with other mammals that is responsible for emotional processing and arousal—the limbic system. This strategy of activating the limbic system is deliberately exploited by online algorithmic content recommendations to stimulate increased user engagement. Two effective stimuli for achieving this effect are those that can induce disgust or rage, as these sentiments naturally produce highly salient responses in people.
An online advertising strategy in which companies create personal profiles about individual users from vast quantities of trace data left behind from their online activity. According to these psychometric profiles, companies display content that matches each user's specific interests at moments when they are most likely to be impacted by it. While traditional advertising appeals to its audience's demographics, microtargeting curates advertising for individuals and becomes increasingly personalized by analyzing new data.
False or misleading information, irrespective of the intent to mislead. Within the category of misinformation, disinformation is a term used to refer to misinformation with intent. In news media, the public generally expects a higher standard for journalistic integrity and editorial safeguards against misinformation; in this context, misinformation is often referred to as “fake news”.
A prevailing school of economic thought that emphasizes the government's role in controlling the supply of money circulating in an economy as the primary determinant of economic growth. This involves central banks using various methods of increasing or decreasing the money supply of their currency (e.g., altering interest rates).
A form of rivalry between nation-states or conflicting groups, by which tactical aims are realized through means other than direct physical violence. Examples include election meddling, blackmailing politicians, or information warfare.
Open societies promote the free exchange of information and public discourse, as well as democratic governance based on the participation of the people in shared choices about their social futures. Unlike the tight control over communications and suppression of dissenting views that characterize closed societies, open societies promote transparent governance and embrace good-faith public scrutiny.
The modern use of the term 'paradigm' was introduced by the philosopher of science Thomas Kuhn in his work "The Structure of Scientific Revolutions". Kuhn's idea is that a paradigm is the set of concepts and practices that define a scientific discipline at any particular period of time. A good example of a paradigm is behaviorism – a paradigm under which studying externally observable behavior was viewed as the only scientifically legitimate form of psychology. Kuhn also argued that science progresses by way of "paradigm shifts," when a leading paradigm transforms into another through advances in understanding and methodology; for example, when the leading paradigm in psychology transformed from behaviorism to cognitivism, which looked at the human mind from an information-processing perspective.
The theory and practice of teaching and learning, and how this process influences, and is influenced by, the social, political, and psychological development of learners.
The ability of an individual or institutional entity to deny knowing about unethical or illegal activities because there is no evidence to the contrary or no such information has been provided.
First coined by philosopher Jürgen Habermas, the term refers to the collective common spaces where people come together to publicly articulate matters of mutual interest for members of society. By extension, the related theory suggests that impartial, representative governance relies on the capacity of the public sphere to facilitate healthy debate.
The word itself is French for rebirth, and this meaning is maintained across its many uses. The term is commonly used with reference to the European Renaissance, a period of European cultural, artistic, political, and economic renewal following the Middle Ages. The term can also refer to other periods of great social change, such as the Bengal Renaissance (beginning in late-18th-century India).
A term proposed by sociologists to characterize emergent properties of social systems after the Second World War. Risk societies are increasingly preoccupied with securing the future against widespread and unpredictable risks. Grappling with these risks differentiates risk societies from modern societies, given that these risks are the byproduct of modernity’s scientific, industrial, and economic advances. This preoccupation with risk is stimulating a feedback loop and a series of changes in the political, cultural, and technological aspects of society.
Sensationalism is a tactic often used in mass media and journalism in which news stories are explicitly chosen and worded to excite the greatest number of readers or viewers, typically at the expense of accuracy. This may be achieved by exaggeration, omission of facts and information, and/or deliberate obstruction of the truth to spark controversy.
A process by which people interpret information and experiences, and structure their understanding of a given domain of knowledge. It is the basis of decision-making: our interpretation of events will inform the rationale for what we do next. As we make sense of the world and accordingly act within it, we also gather feedback that allows us to improve our sensemaking and our capacity to learn. Sensemaking can occur at an individual level through interaction with one’s environment, collectively among groups engaged in discussion, or through socially-distributed reasoning in public discourse.
A theory stating that individuals are willing to sacrifice some of their freedom and agree to state authority under certain legal rules, in exchange for the protection of their remaining rights, provided the rest of society adheres to the same rules of engagement. This model of political philosophy originated during the Age of Enlightenment with theorists including, but not limited to, John Locke, Thomas Hobbes, and Jean-Jacques Rousseau. It was revived in the 20th century by John Rawls and serves as the basis for modern democratic theory.
Autopoiesis, from the Greek αὐτo- (auto-) 'self' and ποίησις (poiesis) 'creation, production', is a term coined in biology that refers to a system’s capability for reproducing and maintaining itself by metabolizing energy to create its own parts, and eventually new emergent components. All living systems are autopoietic. Societal autopoiesis is an extension of the biological term, referring to the process by which a society maintains its capacity to perpetuate and adapt while experiencing relative continuity of shared identity.
A fake online persona, crafted to manipulate public opinion without implicating the account creator—the puppeteer. These fabricated identities can be wielded by anyone, from independent citizens to political organizations and information warfare operatives, with the aim of advancing their chosen agenda. Sock puppet personas can embody any identity their puppeteers want, and a single individual can create and operate numerous accounts. Combined with computational technology such as AI-generated text or automation scripts, propagandists can mimic multiple seemingly legitimate voices to create the illusion of organic popular trends within the public discourse.
Presenting the arguments of disagreeable others in their weakest forms and, after dismissing those, claiming to have discredited their position as a whole.
A worldview that holds technology, specifically developed by private corporations, as the primary driver of civilizational progress. For evidence of its success, adherents point to the consistent global progress in reducing metrics like child mortality and poverty while capitalism has been the dominant economic paradigm. However, the market incentives driving this progress have also resulted in new, sometimes greater, societal problems as externalities.
Used as part of propaganda or advertising campaigns, these are brief, highly-reductive, and definitive-sounding phrases that stop further questioning of ideas. Often used in contexts in which social approval requires unreflective use of the cliché, which can result in confusion at the individual and collective level. Examples include all advertising jingles and catchphrases, and certain political slogans.
A proposition or state of affairs that is impossible to verify or prove true. A further distinction: a state of affairs can be unverifiable at this time (for example, due to constraints in our technical capacity), or unverifiable in principle, meaning there is no possible way to verify the claim.
Creating the image of an anti-hero who epitomizes the worst of a disagreeable group, contrasting it with the best qualities of one's own group, and then characterizing all members of the other group as if they were identical to that image.