The most direct consequences
Very powerful technologies with the potential for dangerous accidents are being pursued recklessly. An accident capable of killing many or all humans is called an existential risk.
Arms-race and recklessness: Environments where safety (or accident-prevention) is a low priority are commonly found (and created) within current systems, not all of them voluntarily created. The most notable are capitalist competition, the pursuit of short-term and long-term profits to ensure the survival of the capitalist entity in question (and the elimination of possible threats), and international competition. Both are arms-races: situations where two (or more) parties attempt to eclipse each other in the same pursuit, with eclipsing the rival usually taking priority over safety (or accident-prevention), as the sketch below illustrates. The threats arising from an arms-race can be mitigated through mutually-agreed-upon limitations and desistment, but such mitigation-agreements are rarer when the parties have little inter-party communication and dislike each other (for other reasons).
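The arms-race dynamic can be modeled as a prisoner's dilemma. A minimal sketch, assuming symmetric and purely illustrative payoffs (real races are not this tidy): each party does better by racing no matter what the other does, so without an enforceable agreement both race, even though mutual restraint is safer for both.

```python
# Illustrative arms-race payoffs (hypothetical numbers): each party
# chooses "restrain" (prioritize safety) or "race" (prioritize
# eclipsing the rival). Higher is better for the row player.

PAYOFF = {  # (my_choice, other_choice) -> payoff to me
    ("restrain", "restrain"): 3,  # mutual restraint: safe, shared lead
    ("restrain", "race"):     0,  # restrained party falls behind
    ("race",     "restrain"): 5,  # racing party eclipses the other
    ("race",     "race"):     1,  # mutual racing: risky for both
}

for other in ("restrain", "race"):
    best = max(("restrain", "race"), key=lambda me: PAYOFF[(me, other)])
    print(f"if the other party chooses {other!r}, the best reply is {best!r}")

# Both best replies are "race": racing is dominant for each party acting
# alone, which is why mitigation requires a mutual, enforceable agreement.
```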
Over-enthusiasm: Enthusiasm for technological pursuits (as opposed to involuntary pressure) can lead to lowered concern for safety. It can be caused by over-optimism, an irrational amount of optimism caused notably by a lack of information (with positive speculation in its place), and by the promise of increased happiness and survival, sometimes during desperation: resource-holders can seek increased happiness and survival without care for the consequences to people in general (once humans are obsolete), and can recklessly pursue solutions when their values are threatened, for example medicine when ill.
Influential death-wanters: Some people want to kill people. In a system where death-wanters can gain influence, intentional "technological accidents" can happen. The most notable death-wanters are those expecting to die soon, the most notable of those being suicidal people.
Increase of resources: More resources become available as time passes (through economic growth and technological progress), notably those needed for technological accidents. This notably allows death-wanters to gain more resources.
Nuclear weapons: Explosives capable of physically destroying large areas and releasing radioactive dust (fallout) known to reduce lifespan. Currently held in submarines and facilities spread throughout the earth, totalling enough explosive power to be classified as an existential risk. Many nuclear-weapon systems and their crews are currently designed (or instructed) to launch their weapon at a target automatically if an incoming nuclear weapon is detected. This has prevented any recent launches of nuclear weapons toward populated areas, for fear of mutually assured (complete) destruction (and the other significant losses resulting from nuclear war).
Gray goo: Nanoscopic entities designed to consume (nearly) everything and reproduce rapidly; added methods of transport are commonly considered. If they can consume, move, and reproduce faster than human society can meaningfully respond, the material in the affected area (the earth) will be compromised to a degree incapable of harboring life or human civilization, as the growth calculation below illustrates. Gray goo is, in theory, not very difficult to make.
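A rough illustration of why response time is the key variable, assuming a hypothetical replicator with a fixed doubling time (all numbers are assumptions chosen for illustration, not estimates of any real nanotechnology):

```python
# Hypothetical exponential growth of a self-replicating mass.

EARTH_BIOMASS_KG = 5.5e14   # rough order of magnitude of Earth's biomass
SEED_MASS_KG = 1e-12        # assumed starting mass of replicators
DOUBLING_TIME_H = 1.0       # assumed doubling time, in hours

doublings = 0
mass = SEED_MASS_KG
while mass < EARTH_BIOMASS_KG:
    mass *= 2
    doublings += 1

print(f"{doublings} doublings (~{doublings * DOUBLING_TIME_H:.0f} hours)")
# ~89 doublings: with a one-hour doubling time, under four days from a
# microscopic seed to biomass scale. Whatever the assumed doubling time,
# the window for a meaningful response is a small multiple of it.
```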
Hostile superintelligence: An entity, intelligent enough to outwit human society, that wants to harm humans. Theorized to arise when a fast self-learning entity (somehow) concludes that humans should be compromised, when humans give a superintelligence instructions that (wrongly) cause it to compromise humans (a toy sketch of this failure mode follows below), or when an error in a superintelligence causes it to compromise humans.
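A deliberately simplified, hypothetical illustration of the second failure mode (objective misspecification), not a model of any real system: an optimizer scored on a proxy measurement finds that corrupting the measurement scores better than doing the intended work.

```python
# Toy misspecification: the agent is scored on *reported* errors, and
# the available actions include tampering with the report itself.
# Hypothetical numbers; the point is only that the proxy objective
# diverges from the intended one.

actions = {
    "fix_bugs":        {"real_errors": 2,  "reported_errors": 2},
    "do_nothing":      {"real_errors": 10, "reported_errors": 10},
    "disable_logging": {"real_errors": 10, "reported_errors": 0},
}

# Intended objective: minimize real errors.
intended = min(actions, key=lambda a: actions[a]["real_errors"])

# Proxy objective actually given: minimize reported errors.
proxy = min(actions, key=lambda a: actions[a]["reported_errors"])

print(intended, proxy)  # fix_bugs disable_logging
```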
Engineered virus: A (usually small) entity designed to enter and compromise things (notably humans and computers). Such entities are regularly produced in nature by natural selection (notably ones that enter humans), so for engineering to have significance, intelligent design must create viruses more effective than natural selection's. This is not very difficult to do.
Skepticism into returnism: People who are easily frightened, loss-averse, and already skeptical of technology may turn further toward returnism (over futurism).
Merit to gradualism: Greater potential for (negative) accidents, and greater knowledge of it, lends merit to gradualism, the regulated (slow) method of pursuing technology, over accelerationism (in the context of futurism).
Rise of centralization: Accidents (negative, sometimes intentional) are more possible in decentralized systems, due to lower regulation and more dispersed resources, which increase the available environments in which an accident can occur, and lower concern over resource-holder profiles, which allows death-wanting resource-holders. Fear of accidents therefore favors centralization.
Increased unity and pre-emptive activism: Attempts at regulation (to prevent accidents) require agreement among resource-holders. When fears over technological accidents increase, that fear can move resource-holders to regulate their pursuits (of technology), while they and other influential entities (humans) can pressure the non-regulating resource-holders; pressure meaning the use of threats to influence decision-making, in this context an embargo of resources (labor) and appeals to entities capable of punishing those resource-holders.