AI is everywhere since October 7, from the battlefield to the cyber arena

The integration of AI into defense, cybersecurity, and mass communication has accelerated dramatically since October 7.

UNIT 8200 soldiers in action – working with data. (photo credit: IDF SPOKESPERSON'S UNIT)

Although artificial intelligence has been used in warfare before, the war in Gaza that followed the events of October 7 has been called by many the First AI War. It has shaped battlefield tactics, intelligence gathering, cyber defense, and public opinion.


Previous conflicts, such as the US-led campaigns in Iraq and Afghanistan, incorporated AI in limited capacities, primarily for intelligence analysis and drone operations. However, the current war has seen AI deeply embedded in nearly every aspect of the conflict. 

From generating convincing fake media designed to inflame public opinion to writing sensor algorithms beyond human capability, AI has added a new layer to nearly every aspect of the war, conferring advantages that highlight the asymmetry between a regular army and terrorist combatants. So, how exactly is AI being used, and what does its rapid integration mean for the future of armed and societal conflicts?

AI and Strategy

Boaz Levy, CEO of Israel Aerospace Industries (IAI), spoke to The Media Line and emphasized that on the battlefield, “AI-driven algorithms have played a crucial role in identifying threats, analyzing terrain, and optimizing responses in real-time.” 

A computer keyboard lit by a displayed cyber code is seen in this illustration picture taken on March 1, 2017. (credit: REUTERS/KACPER PEMPEL/ILLUSTRATION)

Levy detailed: “AI has been instrumental in locating tunnel entrances used by Hamas and in refining radar and interceptor capabilities, because it enhances radar software, improving detection and tracking. It plays a crucial role in the homing phase of Israeli interceptors, ensuring greater accuracy in engaging threats.”

While AI enhances Israel’s ability to detect airborne and surface threats, Hamas’ extensive tunnel network, used for smuggling, launching attacks, and evading surveillance, requires a different approach.

Doron Feldman, a PhD candidate at Tel Aviv University and a researcher of national cybersecurity strategies at the Yuval Ne'eman Workshop, highlighted the use of AI to detect and neutralize the vast tunnel network, spanning hundreds of miles, that Hamas built beneath Gaza. “AI-assisted technologies have been employed to map, analyze, and locate underground passages, aiding in the destruction of these tunnels, which have been used for smuggling, military operations, and as strategic hideouts.”

Modern warfare is increasingly defined by the ability to process and analyze vast amounts of information in real time, and “to ensure that soldiers and decision-makers receive precise, real-time intelligence, AI helps construct a situational picture from large datasets and identify targets within that information. AI-driven algorithms enhance combat awareness by integrating data from sensors across space, air, sea, and land, utilizing various wavelengths to create a comprehensive and accurate operational picture,” added the CEO of IAI.
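How such a fusion step might look in the abstract can be sketched in a few lines of code. The following Python snippet is a purely illustrative, simplified example of combining detections from different sensor types into a single picture; the names and logic are assumptions made for explanation and do not describe any IAI or IDF system.

```python
from dataclasses import dataclass
from math import dist

@dataclass
class Detection:
    sensor: str        # e.g. "radar", "eo_camera", "acoustic" (hypothetical labels)
    position: tuple    # (x, y) in km on a shared grid
    confidence: float  # 0..1, as reported by the sensor

def fuse(detections, radius_km=0.5):
    """Merge detections that fall close together and treat agreement
    between different sensor types as corroboration (toy logic)."""
    tracks = []
    for det in detections:
        for track in tracks:
            if dist(det.position, track["position"]) <= radius_km:
                track["sensors"].add(det.sensor)
                # Combine confidences as if they were independent pieces of evidence
                track["confidence"] = 1 - (1 - track["confidence"]) * (1 - det.confidence)
                break
        else:
            tracks.append({"position": det.position,
                           "sensors": {det.sensor},
                           "confidence": det.confidence})
    return tracks

# Two sensors agreeing on one object, plus one unconfirmed radar blip elsewhere
picture = fuse([
    Detection("radar", (10.0, 4.2), 0.6),
    Detection("eo_camera", (10.1, 4.3), 0.7),
    Detection("radar", (32.5, 8.0), 0.4),
])
for track in picture:
    print(track)
```

Real fusion involves far more than proximity matching (time alignment, tracking filters, classification), but the sketch captures the core idea Levy describes: many sources, one coherent picture.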

Considering that fighting in Gaza happens in densely populated areas, “the IDF has also utilized facial recognition technology to identify potential terrorists and, in doing so, likely aimed to minimize harm to civilians. This has been verified by credible sources, as well as media reports,” he added. 

Throughout the war, the IDF has been making efforts “to minimize harm to non-combatants in various ways, with technology being one of the tools that enables more precise targeting of military objectives and terrorists. If there are civilian casualties, they are not the result of a deliberate policy but rather a consequence of Hamas’s cynical exploitation of Gaza’s residents,” Feldman noted. “Additionally, it is reasonable to assume that AI has been employed in efforts to locate Israeli hostages held in Gaza.”


Human rights organizations, including the United Nations and Human Rights Watch, argue that the IDF's use of AI has led to abuses, particularly in its targeting processes during the war in Gaza. They claim that AI systems, which reportedly assist in identifying targets, have contributed to a high number of civilian casualties because of flaws in the algorithms and insufficient human oversight. The IDF, however, maintains that AI enhances its operational efficiency and precision, ultimately reducing unintended harm by improving intelligence analysis.

Physical war

However, AI isn’t just optimizing sensors on the battlefield; it also operates alongside soldiers storming Hamas’s tunnels and buildings. XTEND is a defense technology company that specializes in developing autonomous drone systems designed to enhance military operations. Its AI-powered drones allow soldiers to perform reconnaissance, target threats, and carry out complex maneuvers in tight spaces without exposing themselves to direct danger. XTEND works with the IDF, the US Army, the Department of Defense, and Special Operations Command.

Rubi Liani, CTO and co-founder of XTEND, shared with The Media Line that their drones use AI-based software, enabling soldiers with little to no drone-piloting experience to perform tasks and accomplish missions. “Drones need to operate in confined spaces, such as disaster sites, collapsed buildings, or underground tunnels. AI helps with automatic target recognition, mission execution despite jamming, and ensuring the last mile of an operation is completed even if communication is lost.”

According to XTEND’s CTO, the main challenge with AI is decision-making, which is not yet fully mature. “Our approach allows human operators to focus on the mission rather than piloting. The system removes cognitive overload by enabling the operator to direct the drone, such as pointing at a window they want to navigate through, without manually controlling flight maneuvers. Also, operators receive dozens of drones per system, controlling them as swarms rather than individual units.” According to XTEND, the company delivered hundreds of drones and systems in the past year.
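One small piece of that operator-directs, system-flies model can be pictured with a toy example. The Python sketch below greedily assigns each operator-selected waypoint (say, a window or a tunnel entrance) to the nearest available drone in a swarm; the names and logic are hypothetical and do not represent XTEND's software.

```python
from math import dist

def assign_swarm(drones, waypoints):
    """Greedy assignment: each waypoint is claimed by the nearest free drone.
    Real swarm planners also account for battery, obstacles, and re-planning."""
    free = dict(drones)            # drone_id -> (x, y, z) position
    plan = {}
    for wp_id, wp in waypoints.items():
        if not free:
            break                  # more waypoints than drones
        nearest = min(free, key=lambda d: dist(free[d], wp))
        plan[wp_id] = nearest
        del free[nearest]
    return plan

drones = {"d1": (0, 0, 10), "d2": (50, 5, 12), "d3": (20, 30, 8)}
waypoints = {"window_A": (48, 6, 15), "tunnel_mouth": (2, 1, 0)}
print(assign_swarm(drones, waypoints))   # {'window_A': 'd2', 'tunnel_mouth': 'd1'}
```

The point of such a layer is exactly what Liani describes: the operator specifies intent (where to go, what to look at), and the software handles the flying.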

Another example of AI being used directly by soldiers is its recent integration into Israel’s Merkava 4 tanks, which “enhanced its defensive and offensive capabilities,” Feldman detailed. “It increases the crew’s situational awareness and enables them to operate and engage targets from within the tank, thereby reducing exposure to battlefield risks and minimizing casualties.”

Cybersecurity

Beyond physical warfare, AI has also emerged as a powerful tool in cybersecurity. Sergey Shykevich, Threat Intelligence Group Manager at Check Point, highlighted how AI is being used to counter the surge in cyber threats against Israel since October 7. “Since the war began, Israel has faced a significant increase in cyberattacks—an average of 1,673 per organization per week, marking a 44% increase from 2023,” Shykevich revealed. “However, most of these attacks were not highly sophisticated. While the volume of attacks was unprecedented, the actual technical complexity of most of them remained relatively low.”

Given the disparity in physical power between the IDF and the terrorist groups in Gaza, the latter leverage AI technologies to launch frequent cyberattacks against Israel. Meanwhile, Israel has also been using AI for cyber defense. As Shykevich pointed out, “AI is a double-edged sword; both cyber defenders and attackers are racing to integrate it into their operations.”

This dynamic cyber battleground illustrates how AI has become an essential tool. During the war, there were even reports that Israel used the Cyber Dome system, “which includes artificial intelligence capabilities, to actively counter and mitigate numerous cyber threats and attacks, including on critical infrastructure, originating from Iran and its proxies since the beginning of the war,” Feldman added.

AI has been crucial in filtering malicious traffic, automating security responses, and strengthening Israel’s cybersecurity. However, Shykevich pointed out that it has not yet led to the development of new cyber threats. “AI has primarily made existing attack methods more efficient rather than introducing entirely new cyber threats,” he explained.
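At its simplest, the kind of filtering described here can be thought of as flagging traffic that deviates sharply from a learned baseline. The toy Python sketch below does this with basic statistics; it is an illustrative assumption, not a description of Check Point's products or methods.

```python
from collections import Counter
from statistics import mean, stdev

def flag_suspicious(request_log, z_threshold=3.0):
    """Flag source IPs whose request volume is far above the baseline.
    A toy stand-in for the ML-driven filtering used in real products."""
    counts = Counter(ip for ip, _path in request_log)
    volumes = list(counts.values())
    if len(volumes) < 2:
        return []
    mu, sigma = mean(volumes), stdev(volumes)
    if sigma == 0:
        return []
    return [ip for ip, n in counts.items() if (n - mu) / sigma > z_threshold]

# One noisy source among twenty normal ones (addresses are documentation ranges)
log = [("203.0.113.7", "/login")] * 500
log += [(f"192.0.2.{i}", "/index") for i in range(20) for _ in range(4)]
print(flag_suspicious(log))   # ['203.0.113.7']
```

Production systems layer many such signals (payload inspection, reputation feeds, behavioral models), but the principle of automated, baseline-driven filtering is the same.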

Hearts and minds 

In the digital realm of the Israeli-Palestinian conflict, anti-Israel actors have also been using AI to inflame and radicalize public opinion. Dr. Ron Schleifer, a psychological warfare analyst at Ariel University, explained to The Media Line how Hamas has weaponized AI-generated content for psychological warfare. “Militarily, Hamas is no match for the IDF. Their strategy, therefore, relies on psychological warfare. What matters is not the territory captured but the image left behind,” he explained.

On October 7, Hamas operatives wore body-mounted cameras and filmed their atrocities “with the specific purpose of producing high-definition material for use in their propaganda, which is also getting more sophisticated,” Dr. Schleifer pointed out. 

As modern combat is increasingly defined by how it is visually documented and presented, “instead of seeing a battle from a single perspective, we now see it from multiple perspectives, multiplying the amount of visual information exponentially,” he said. “The critical question then becomes: Who uses this battle footage, and how is it used to advance political goals?”

In the Israeli-Palestinian conflict, Hamas’s use of images from October 7 was meant to fracture Israeli society’s willingness to fight. “Hamas understands it cannot achieve a military victory, so it jumps straight to the political battlefield. Psychological warfare is their primary weapon—hence their focus on kidnappings and on using Israeli society’s internal divisions against itself.”

In this sense, Israel faces a disadvantage. As a democratic state with an orderly military, it must verify all information before releasing it. Hamas, on the other hand, is not bound by such constraints and can publish anything without delay. “AI exacerbates this imbalance. AI can generate fake news, deepfake videos, memes, and emotionally charged visual content—all of which are powerful psychological weapons,” noted Dr. Schleifer.

Currently, there is no binding international agreement governing AI’s use in warfare. While the US, EU, and Israel have debated ethical AI principles, adversaries, including terror groups and rogue states, face no such constraints. This imbalance poses a long-term strategic risk that policymakers must urgently address, but it won’t decelerate the adoption of AI technologies in conflicts.

Boaz Levy from IAI anticipates that “AI will continue to play an increasing role in military operations. We will see the development of more advanced algorithms that optimize decision-making, improving defense and offensive capabilities.”

Still, experts believe AI technologies will continue to depend on human operators. While AI is undoubtedly transforming various aspects of security and defense, Feldman believes AI is still far from revolutionizing the battlefield in the foreseeable future. “Countries and militaries that successfully integrate AI into their operations will likely gain a strategic edge over their adversaries, but AI remains in its early stages of battlefield application and cannot yet replace the core functions of human soldiers.”

Israel must maintain high levels of readiness and operational preparedness, especially in light of the intelligence failure of October 7. While AI can enhance threat detection, decision-making, and operational efficiency, it is not a substitute for human judgment and resilience. 

As warfare evolves, will AI give a disproportionate advantage to well-funded militaries, creating an even more significant power imbalance between nations and non-state actors? Could AI-enabled misinformation campaigns become a standard tool for influencing international politics? These and other questions remain unanswered, but the ongoing war in Gaza has demonstrated that AI is not just a tool of the future; it is already reshaping how wars are fought and won.