Technology ("science of craft", from Greek τέχνη, techne, "art, skill, cunning of hand"; and -λογία, -logia[2]) is the sum of techniques, skills, methods, and processes used in the production of goods or services or in the accomplishment of objectives, such as scientific investigation. Technology can be the knowledge of techniques, processes, and the like, or it can be embedded in machines to allow for operation without detailed knowledge of their workings. Systems (e.g. machines) applying technology by taking an input, changing it according to the system's use, and then producing an outcome are referred to as technology systems or technological systems.
The simplest form of technology is the development and use of basic tools. The prehistoric invention of shaped stone tools, followed by the discovery of how to control fire, increased the available sources of food. The later Neolithic Revolution extended this, and quadrupled the sustenance available from a territory. The invention of the wheel helped humans to travel in and control their environment.
Developments in historic times, including the printing press, the telephone, and the Internet, have lessened physical barriers to communication and allowed humans to interact freely on a global scale.
Technology has many effects. It has helped develop more advanced economies (including today's global economy) and has allowed the rise of a leisure class. Many technological processes produce unwanted by-products known as pollution and deplete natural resources to the detriment of Earth's environment. Innovations have always influenced the values of a society and raised new questions in the ethics of technology. Examples include the rise of the notion of efficiency in terms of human productivity, and the challenges of bioethics.
Philosophical debates have arisen over the use of technology, with disagreements over whether technology improves the human condition or worsens it. Neo-Luddism, anarcho-primitivism, and similar reactionary movements criticize the pervasiveness of technology, arguing that it harms the environment and alienates people; proponents of ideologies such as transhumanism and techno-progressivism view continued technological progress as beneficial to society and the human condition.
The use of the term "technology" has changed significantly over the last 200 years. Before the 20th century, the term was uncommon in English, and it was used either to refer to the description or study of the useful arts[3] or to allude to technical education, as in the Massachusetts Institute of Technology (chartered in 1861).[4]
The term "technology" rose to prominence in the 20th century in connection with the Second Industrial Revolution. The term's meanings changed in the early 20th century when American social scientists, beginning with Thorstein Veblen, translated ideas from the German concept of Technik into "technology." In German and other European languages, a distinction exists between technik and technologie that is absent in English, which usually translates both terms as "technology." By the 1930s, "technology" referred not only to the study of the industrial arts but to the industrial arts themselves.[5]
In 1937, the American sociologist Read Bain wrote that "technology includes all tools, machines, utensils, weapons, instruments, housing, clothing, communicating and transporting devices and the skills by which we produce and use them."[6] Bain's definition remains common among scholars today, especially social scientists. Scientists and engineers usually prefer to define technology as applied science, rather than as the things that people make and use.[7] More recently, scholars have borrowed from European philosophers of "technique" to extend the meaning of technology to various forms of instrumental reason, as in Foucault's work on technologies of the self (techniques de soi).
Dictionaries and scholars have offered a variety of definitions. The Merriam-Webster Learner's Dictionary offers a definition of the term: "the use of science in industry, engineering, etc., to invent useful things or to solve problems" and "a machine, piece of equipment, method, etc., that is created by technology."[8] Ursula Franklin, in her 1989 "Real World of Technology" lecture, gave another definition of the concept; it is "practice, the way we do things around here."[9] The term is often used to imply a specific field of technology, or to refer to high technology or just consumer electronics, rather than technology as a whole.[10] Bernard Stiegler, in Technics and Time, 1, defines technology in two ways: as "the pursuit of life by means other than life," and as "organized inorganic matter."[11]
Technology can be most broadly defined as the entities, both material and immaterial, created by the application of mental and physical effort in order to achieve some value. In this usage, technology refers to tools and machines that may be used to solve real-world problems. It is a far-reaching term that may include simple tools, such as a crowbar or wooden spoon, or more complex machines, such as a space station or particle accelerator. Tools and machines need not be material; virtual technology, such as computer software and business methods, falls under this definition of technology.[12] W. Brian Arthur defines technology in a similarly broad way as "a means to fulfill a human purpose."[13]
The word "technology" can also be used to refer to a collection of techniques. In this context, it is the current state of humanity's knowledge of how to combine resources to produce desired products, to solve problems, fulfill needs, or satisfy wants; it includes technical methods, skills, processes, techniques, tools and raw materials. When combined with another term, such as "medical technology" or "space technology," it refers to the state of the respective field's knowledge and tools. "State-of-the-art technology" refers to the high technology available to humanity in any field.
Technology can be viewed as an activity that forms or changes culture.[14] Additionally, technology is the application of mathematics, science, and the arts for the benefit of life as it is known. A modern example is the rise of communication technology, which has lessened barriers to human interaction and as a result has helped spawn new subcultures; the rise of cyberculture has at its basis the development of the Internet and the computer.[15] As a cultural activity, technology predates both science and engineering, each of which formalizes some aspects of technological endeavor.
The distinction between science, engineering, and technology is not always clear. Science is systematic knowledge of the physical or material world gained through observation and experimentation.[16] Technologies are not usually exclusively products of science, because they have to satisfy requirements such as utility, usability, and safety.[17]
Engineering is the goal-oriented process of designing and making tools and systems to exploit natural phenomena for practical human means, often (but not always) using results and techniques from science. The development of technology may draw upon many fields of knowledge, including scientific, engineering, mathematical, linguistic, and historical knowledge, to achieve some practical result.
Technology is often a consequence of science and engineering, although technology as a human activity precedes the two fields. For example, science might study the flow of electrons in electrical conductors by using already-existing tools and knowledge. This new-found knowledge may then be used by engineers to create new tools and machines such as semiconductors, computers, and other forms of advanced technology. In this sense, scientists and engineers may both be considered technologists; the three fields are often considered as one for the purposes of research and reference.[18]
The exact relations between science and technology, in particular, have been debated by scientists, historians, and policymakers in the late 20th century, in part because the debate can inform the funding of basic and applied science. In the immediate wake of World War II, for example, it was widely considered in the United States that technology was simply "applied science" and that to fund basic science was to reap technological results in due time. An articulation of this philosophy could be found explicitly in Vannevar Bush's treatise on postwar science policy, Science – The Endless Frontier: "New products, new industries, and more jobs require continuous additions to knowledge of the laws of nature ... This essential new knowledge can be obtained only through basic scientific research."[19] In the late-1960s, however, this view came under direct attack, leading towards initiatives to fund science for specific tasks (initiatives resisted by the scientific community). The issue remains contentious, though most analysts resist the model that technology is a result of scientific research.[20][21]
The use of tools by early humans was partly a process of discovery and of evolution. Early humans evolved from a species of foraging hominids which were already bipedal,[22] with a brain mass approximately one third that of modern humans.[23] Tool use remained relatively unchanged for most of early human history. Approximately 50,000 years ago, the use of tools and a complex set of behaviors emerged, believed by many archaeologists to be connected to the emergence of fully modern language.[24]
Hominids started using primitive stone tools millions of years ago. The earliest stone tools were little more than fractured rocks, but approximately 75,000 years ago,[25] pressure flaking provided a way to make much finer work.
The discovery and use of fire, a simple energy source with many profound uses, was a turning point in the technological evolution of humankind.[26] The exact date of its discovery is not known; evidence of burnt animal bones at the Cradle of Humankind suggests that the domestication of fire occurred before 1 Ma;[27] scholarly consensus indicates that Homo erectus had controlled fire by between 500 and 400 ka.[28][29] Fire, fueled with wood and charcoal, allowed early humans to cook their food to increase its digestibility, improving its nutrient value and broadening the range of foods that could be eaten.[30]
Other technological advances made during the Paleolithic era were clothing and shelter; the adoption of both technologies cannot be dated exactly, but they were a key to humanity's progress. As the Paleolithic era progressed, dwellings became more sophisticated and more elaborate; as early as 380 ka, humans were constructing temporary wood huts.[31][32] Clothing, adapted from the fur and hides of hunted animals, helped humanity expand into colder regions; humans began to migrate out of Africa by 200 ka and into other continents such as Eurasia.[33]
Humans' technological ascent began in earnest in what is known as the Neolithic Period ("New Stone Age"). The invention of polished stone axes was a major advance that allowed forest clearance on a large scale to create farms. The use of polished stone axes increased greatly in the Neolithic, though such axes were originally used in the preceding Mesolithic in some areas such as Ireland.[34] Agriculture fed larger populations, and the transition to sedentism allowed more children to be raised simultaneously, as infants no longer needed to be carried, as they must be among nomads. Additionally, children could contribute labor to the raising of crops more readily than they could to the hunter-gatherer economy.[35][36]
With this increase in population and availability of labor came an increase in labor specialization.[37] What triggered the progression from early Neolithic villages to the first cities, such as Uruk, and the first civilizations, such as Sumer, is not specifically known; however, the emergence of increasingly hierarchical social structures and specialized labor, of trade and war amongst adjacent cultures, and the need for collective action to overcome environmental challenges such as irrigation, are all thought to have played a role.[38]
Continuing improvements led to the furnace and bellows and provided, for the first time, the ability to smelt and forge gold, copper, silver, and lead – native metals found in relatively pure form in nature.[39] The advantages of copper tools over stone, bone, and wooden tools were quickly apparent to early humans, and native copper was probably used from near the beginning of Neolithic times (about 10 ka).[40] Native copper does not naturally occur in large amounts, but copper ores are quite common and some of them produce metal easily when burned in wood or charcoal fires. Eventually, the working of metals led to the discovery of alloys such as bronze and brass (about 4000 BCE). The first uses of iron alloys such as steel date to around 1800 BCE.[41][42]
Meanwhile, humans were learning to harness other forms of energy. The earliest known use of wind power is the sailing ship; the earliest record of a ship under sail is that of a Nile boat dating to the 8th millennium BCE.[43] From prehistoric times, Egyptians probably used the power of the annual flooding of the Nile to irrigate their lands, gradually learning to regulate much of it through purposely built irrigation channels and "catch" basins. The ancient Sumerians in Mesopotamia used a complex system of canals and levees to divert water from the Tigris and Euphrates rivers for irrigation.[44]
According to archaeologists, the wheel was invented around 4000 BCE probably independently and nearly simultaneously in Mesopotamia (in present-day Iraq), the Northern Caucasus (Maykop culture) and Central Europe.[45] Estimates on when this may have occurred range from 5500 to 3000 BCE with most experts putting it closer to 4000 BCE.[46] The oldest artifacts with drawings depicting wheeled carts date from about 3500 BCE;[47] however, the wheel may have been in use for millennia before these drawings were made. More recently, the oldest-known wooden wheel in the world was found in the Ljubljana marshes of Slovenia.[48]
The invention of the wheel revolutionized trade and war. It did not take long to discover that wheeled wagons could be used to carry heavy loads. The ancient Sumerians used the potter's wheel and may have invented it.[49] A stone pottery wheel found in the city-state of Ur dates to around 3429 BCE,[50] and even older fragments of wheel-thrown pottery have been found in the same area.[50] Fast (rotary) potters' wheels enabled early mass production of pottery, but it was the use of the wheel as a transformer of energy (through water wheels, windmills, and even treadmills) that revolutionized the application of nonhuman power sources. The first two-wheeled carts were derived from travois[51] and were first used in Mesopotamia and Iran in around 3000 BCE.[51]
The oldest known constructed roadways are the stone-paved streets of the city-state of Ur, dating to circa 4000 BCE[52] and timber roads leading through the swamps of Glastonbury, England, dating to around the same time period.[52] The first long-distance road, which came into use around 3500 BCE,[52] spanned 1,500 miles (about 2,400 kilometers) from the Persian Gulf to the Mediterranean Sea,[52] but was not paved and was only partially maintained.[52] In around 2000 BCE, the Minoans on the Greek island of Crete built a fifty-kilometer (thirty-mile) road leading from the palace of Gortyn on the south side of the island, through the mountains, to the palace of Knossos on the north side of the island.[52] Unlike the earlier road, the Minoan road was completely paved.[52]
Ancient Minoan private homes had running water.[54] A bathtub virtually identical to modern ones was unearthed at the Palace of Knossos.[54][55] Several Minoan private homes also had toilets, which could be flushed by pouring water down the drain.[54] The ancient Romans had many public flush toilets,[55] which emptied into an extensive sewage system.[55] The primary sewer in Rome was the Cloaca Maxima;[55] construction began on it in the sixth century BCE and it is still in use today.[55]
The ancient Romans also had a complex system of aqueducts,[53] which were used to transport water across long distances.[53] The first Roman aqueduct was built in 312 BCE.[53] The eleventh and final ancient Roman aqueduct was built in 226 CE.[53] Put together, the Roman aqueducts extended over 450 kilometers,[53] but less than seventy kilometers of this was above ground and supported by arches.[53]
Innovation continued through the Middle Ages with advances such as silk manufacture (introduced into Europe after centuries of development in Asia), the horse collar, and horseshoes in the first few hundred years after the 5th-century fall of the Roman Empire. Medieval technology saw the use of simple machines (such as the lever, the screw, and the pulley) being combined to form more complicated tools, such as the wheelbarrow, windmills, and clocks, and a system of universities developed and spread scientific ideas and practices. The Renaissance era produced many innovations, including the printing press (which facilitated the communication of knowledge), and technology became increasingly associated with science, beginning a cycle of mutual advancement. Advances in technology in this era allowed a more reliable supply of food, followed by the wider availability of consumer goods.
Starting in the United Kingdom in the 18th century, the Industrial Revolution was a period of great technological discovery, particularly in the areas of agriculture, manufacturing, mining, metallurgy, and transport, driven by the discovery of steam power and the widespread application of the factory system. Technology took another step in a second industrial revolution (c. 1870 to c. 1914) with the harnessing of electricity to allow such innovations as the electric motor, light bulb, and countless others. Scientific advances and the discovery of new concepts later allowed for powered flight and developments in medicine, chemistry, physics, and engineering. The rise in technology has led to skyscrapers and broad urban areas whose inhabitants rely on motors to transport them and their food supplies. Communication improved with the invention of the telegraph, telephone, radio and television. The late-19th and early-20th centuries saw a revolution in transportation with the invention of the airplane and automobile.
The 20th century brought a host of innovations. In physics, the discovery of nuclear fission has led to both nuclear weapons and nuclear power. Computers were invented and later miniaturized using transistors and integrated circuits. Information technology subsequently led to the birth in the 1980s of the Internet, which ushered in the current Information Age. Humans started to explore space with satellites (late 1950s, later used for telecommunication) and in manned missions (1960s) going all the way to the moon. In medicine, this era brought innovations such as open-heart surgery and later stem-cell therapy along with new medications and treatments.
Complex manufacturing and construction techniques and organizations are needed to make and maintain some of the newer technologies, and entire industries have arisen to support and develop succeeding generations of increasingly complex tools. Modern technology increasingly relies on training and education – its designers, builders, maintainers, and users often require sophisticated general and specific training. Moreover, these technologies have become so complex that entire fields have developed to support them, including engineering, medicine, and computer science; and other fields have become more complex, such as construction, transportation, and architecture.
Generally, technicism is the belief in the utility of technology for improving human societies.[56] Taken to an extreme, technicism "reflects a fundamental attitude which seeks to control reality, to resolve all problems with the use of scientific–technological methods and tools."[57] In other words, human beings will someday be able to master all problems and possibly even control the future using technology. Some, such as Stephen V. Monsma,[58] connect these ideas to the abdication of religion as a higher moral authority.
Optimistic assumptions are made by proponents of ideologies such as transhumanism and singularitarianism, which view technological development as generally having beneficial effects for society and the human condition. In these ideologies, technological development is morally good.
Transhumanists generally believe that the point of technology is to overcome barriers, and that what we commonly refer to as the human condition is just another barrier to be surpassed.
Singularitarians believe in some sort of "accelerating change": that the rate of technological progress accelerates as we obtain more technology, and that this will culminate in a "Singularity" after artificial general intelligence is invented, in which progress is nearly infinite; hence the term. Estimates for the date of this Singularity vary,[59] but prominent futurist Ray Kurzweil estimates the Singularity will occur in 2045.
Kurzweil is also known for his history of the universe in six epochs: (1) the physical/chemical epoch, (2) the life epoch, (3) the human/brain epoch, (4) the technology epoch, (5) the artificial intelligence epoch, and (6) the universal colonization epoch. Going from one epoch to the next is a Singularity in its own right, and a period of speeding up precedes it. Each epoch takes a shorter time, which means the whole history of the universe is one giant Singularity event.[60]
Some critics see these ideologies as examples of scientism and techno-utopianism and fear the notion of human enhancement and technological singularity which they support. Some have described Karl Marx as a techno-optimist.[61]
On the somewhat skeptical side are certain philosophers like Herbert Marcuse and John Zerzan, who believe that technological societies are inherently flawed. They suggest that the inevitable result of such a society is to become ever more technological at the cost of freedom and psychological health.
Many, such as the Luddites and prominent philosopher Martin Heidegger, hold serious, although not entirely deterministic, reservations about technology (see "The Question Concerning Technology"[62]). According to Heidegger scholars Hubert Dreyfus and Charles Spinosa, "Heidegger does not oppose technology. He hopes to reveal the essence of technology in a way that 'in no way confines us to a stultified compulsion to push on blindly with technology or, what comes to the same thing, to rebel helplessly against it.' Indeed, he promises that 'when we once open ourselves expressly to the essence of technology, we find ourselves unexpectedly taken into a freeing claim.'[63] What this entails is a more complex relationship to technology than either techno-optimists or techno-pessimists tend to allow."[64]
Some of the most poignant criticisms of technology are found in what are now considered to be dystopian literary classics such as Aldous Huxley's Brave New World, Anthony Burgess's A Clockwork Orange, and George Orwell's Nineteen Eighty-Four. In Goethe's Faust, Faust's sale of his soul to the devil in return for power over the physical world is also often interpreted as a metaphor for the adoption of industrial technology. More recently, modern works of science fiction such as those by Philip K. Dick and William Gibson and films such as Blade Runner and Ghost in the Shell project highly ambivalent or cautionary attitudes toward technology's impact on human society and identity.
The late cultural critic Neil Postman distinguished tool-using societies from technological societies and from what he called "technopolies," societies that are dominated by the ideology of technological and scientific progress to the exclusion or harm of other cultural practices, values, and world-views.[65]
Darin Barney has written about technology's impact on practices of citizenship and democratic culture, suggesting that technology can be construed as (1) an object of political debate, (2) a means or medium of discussion, and (3) a setting for democratic deliberation and citizenship. As a setting for democratic culture, Barney suggests that technology tends to make ethical questions, including the question of what a good life consists in, nearly impossible to raise, because it already gives an answer to the question: a good life is one that includes the use of more and more technology.[66]
Nikolas Kompridis has also written about the dangers of new technology, such as genetic engineering, nanotechnology, synthetic biology, and robotics. He warns that these technologies introduce unprecedented new challenges to human beings, including the possibility of the permanent alteration of our biological nature. These concerns are shared by other philosophers, scientists and public intellectuals who have written about similar issues (e.g. Francis Fukuyama, Jürgen Habermas, William Joy, and Michael Sandel).[67]
Another prominent critic of technology is Hubert Dreyfus, who has published books such as On the Internet and What Computers Still Can't Do.
A more infamous anti-technological treatise is Industrial Society and Its Future, written by the Unabomber Ted Kaczynski and printed in several major newspapers (and later books) as part of an effort to end his bombing campaign of the techno-industrial infrastructure. There are also subcultures that disapprove of some or most technology, such as self-identified off-gridders.[68]
The notion of appropriate technology was developed in the 20th century by thinkers such as E.F. Schumacher and Jacques Ellul to describe situations where it was not desirable to use very new technologies or those that required access to some centralized infrastructure or parts or skills imported from elsewhere. The ecovillage movement emerged in part due to this concern.
This section mainly focuses on American concerns, though its argument can reasonably be generalized to other Western countries.
The inadequate quantity and quality of American jobs is one of the most fundamental economic challenges we face. [...] What's the linkage between technology and this fundamental problem?
— Bernstein, Jared, "It’s Not a Skills Gap That’s Holding Wages Down: It’s the Weak Economy, Among Other Things," in The American Prospect, October 2014
In his article, Jared Bernstein, a Senior Fellow at the Center on Budget and Policy Priorities,[69] questions the widespread idea that automation, and more broadly, technological advances, have mainly contributed to this growing labor market problem. His thesis appears to be a third way between optimism and skepticism. Essentially, he stands for a neutral approach to the linkage between technology and American issues concerning unemployment and declining wages.
He uses two main arguments to defend his point. First, although recent technological advances have cost some workers their jobs, scientific evidence fails to clearly demonstrate that technology has displaced so many workers that it has created more problems than it has solved. Indeed, automation threatens repetitive jobs, but higher-end jobs are still necessary because they complement technology, and manual work that "requires flexibility, judgment and common sense"[70] remains hard to replace with machines. Second, studies have not shown clear links between recent technological advances and the wage trends of the last decades.
Therefore, according to Bernstein, instead of focusing on technology and its hypothetical influence on rising American unemployment and declining wages, one needs to worry more about "bad policy that fails to offset the imbalances in demand, trade, income, and opportunity."[70]
Thomas P. Hughes stated that because technology has been considered a key way to solve problems, we need to be aware of its complex and varied character to use it more efficiently.[71] What is the difference between a wheel or a compass and cooking machines such as an oven or a gas stove? Can we consider all of them, only a part of them, or none of them as technologies?
Technology is often considered too narrowly; according to Hughes, "Technology is a creative process involving human ingenuity".[72] This definition's emphasis on creativity avoids unbounded definitions that may mistakenly include cooking "technologies," but it also highlights the prominent role of humans and therefore their responsibilities for the use of complex technological systems.
Yet, because technology is everywhere and has dramatically changed landscapes and societies, Hughes argues that engineers, scientists, and managers have often believed that they can use technology to shape the world as they want. They have often supposed that technology is easily controllable, and this assumption has to be thoroughly questioned.[71] For instance, Evgeny Morozov particularly challenges two concepts: "Internet-centrism" and "solutionism."[73] Internet-centrism refers to the idea that our society is convinced that the Internet is one of the most stable and coherent forces. Solutionism is the ideology that every social issue can be solved thanks to technology and especially thanks to the Internet. In fact, technology intrinsically contains uncertainties and limitations. According to Alexis Madrigal's review of Morozov's theory, ignoring these uncertainties and limitations will lead to "unexpected consequences that could eventually cause more damage than the problems they seek to address."[74] Benjamin R. Cohen and Gwen Ottinger also discussed the multivalent effects of technology.[75]
Therefore, recognition of the limitations of technology, and more broadly, scientific knowledge, is needed – especially in cases dealing with environmental justice and health issues. Ottinger continues this reasoning and argues that the ongoing recognition of the limitations of scientific knowledge goes hand in hand with scientists and engineers' new comprehension of their role. Such an approach to technology and science "[requires] technical professionals to conceive of their roles in the process differently. [They have to consider themselves as] collaborators in research and problem solving rather than simply providers of information and technical solutions."[76]
The use of basic technology is also a feature of other animal species apart from humans. These include primates such as chimpanzees,[77] some dolphin communities,[78] and crows.[79][80] From a more generic perspective of technology as the ethology of active environmental conditioning and control, we can also refer to animal examples such as beavers and their dams, or bees and their honeycombs.
The ability to make and use tools was once considered a defining characteristic of the genus Homo.[81] However, the discovery of tool construction among chimpanzees and related primates has discarded the notion that the use of technology is unique to humans. For example, researchers have observed wild chimpanzees using tools for foraging: some of the tools used include leaf sponges, termite fishing probes, pestles and levers.[82] West African chimpanzees also use stone hammers and anvils for cracking nuts,[83] as do capuchin monkeys of Boa Vista, Brazil.[84]
Theories of technology often attempt to predict the future of technology based on the high technology and science of the time. As with all predictions of the future, however, technology's path is uncertain.
In 2005, futurist Ray Kurzweil predicted that the future of technology would mainly consist of an overlapping "GNR Revolution" of genetics, nanotechnology and robotics, with robotics being the most important of the three.[85] This future revolution has been explored in films, novels, and video games, which have imagined many inventions and foreseen future events. Such inventions and events include a government-controlled simulation that resulted from massive advances in robotics (The Matrix), a society that has rid itself of procreation due to improvements in genetic engineering (Brave New World), and a police state enforced by the government using data mining, nanobots, and drones (Watch Dogs). Humans have already made some of the first steps toward achieving the GNR revolution.
Recent discoveries and ingenuity have allowed us to create robotics in the form of artificial intelligence, as well as in the physical form of robots. Artificial intelligence has been used for a variety of purposes, including personal assistants in smartphones, one of the first of which was Siri, released on the iPhone 4S in 2011 by Apple.[86] Some believe that the future of robotics will involve a 'greater than human non-biological intelligence.'[87] This concept can be compared to that of a 'rogue AI,' an artificial intelligence that has gained self-awareness and tries to eradicate humanity. Others believe that the future will involve AI servants creating an easy and effortless life for humankind, where robots have become the primary work force. This future shares many similarities with the concept of planned obsolescence; however, planned obsolescence is seen as a 'sinister business strategy.'[88] Human-controlled robots such as drones have been developed to carry out tasks such as bomb defusal and space exploration. Universities such as Harvard are working towards the invention of autonomous robots to be used in situations that would aid humans, such as surgery robots, search and rescue robots, and physical therapy robots.[89]
Genetics has also been explored, with humans understanding genetic engineering to a certain degree. However, gene editing is widely divisive and often raises concerns about eugenics. Some have speculated that the future of human engineering will include 'super humans,' humans who have been genetically engineered to be faster, stronger, and more survivable than current humans. Others think that genetic engineering will be used to make humans more resistant or completely immune to some diseases.[90] Some even suggest that 'cloning,' the process of creating a genetically identical copy of a human, may be possible through genetic engineering.
Some believe that within the next 10 years humans will develop nanobot technology, while others believe that we are centuries away from its invention. Futurists believe that nanobot technology will allow humans to 'manipulate matter at the molecular and atomic scale.' This discovery could pave the way for many scientific and medical advancements, such as curing new diseases or inventing new, more efficient technology. It is also believed that nanobots could be injected or otherwise inserted into the human body and replace certain parts, keeping humans healthy for an incredibly long time or combating organ failure to a degree.
The 'GNR revolution' would bring a new age of technology and advancement for humanity like none that has been seen before.