
NEC’s AI Driving Diagnosis: When Video AI and LLM Meet the Road

At CEATEC 2025, NEC unveiled an excellent example of how generative AI can improve real-world safety. Its AI Driving Diagnosis system, demonstrated inside the company’s booth, turns ordinary dashcam footage into an intelligent conversation about how we drive—and how we could drive better.


The concept might sound like another driver-monitoring gadget, but NEC’s approach is quite different. By combining its video recognition AI with a large language model (LLM), the system does more than detect patterns: it understands them. It interprets the context of driving behavior—whether a sudden acceleration, a risky lane change, or a near-miss—and explains what happened in human terms, complete with advice to prevent future accidents.

From Simulator to Service: Driving Diagnosis for Insurance and Fleet Management

During the NEC demo at CEATEC, Ubergizmo co-founder Hubert Nguyen sat at a simulator equipped with a steering wheel, pedals, and multiple screens replicating real-life road conditions. Within minutes, NEC’s AI analyzed the dashcam and sensor data—speed, acceleration, and GPS—and generated a concise driving diagnosis report.


The system assessed each maneuver, identifying abrupt braking, uneven acceleration, or smooth turns, and produced a summary that could be shared with insurers, fleet managers, or municipal transport agencies. According to NEC representatives, the same engine can generate spoken feedback for real-time coaching or automatically deliver written reports to telematics platforms.
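NEC has not published how its engine scores individual maneuvers, but the telemetry side of such an assessment can be sketched with simple thresholding over speed and acceleration samples. Everything below (the sample fields, event names, and thresholds) is a hypothetical illustration, not NEC's implementation:

```python
# Illustrative sketch only: NEC has not published its event definitions or
# thresholds, so the maneuver names and limits below are hypothetical.
from dataclasses import dataclass

@dataclass
class TelemetrySample:
    t: float           # seconds since trip start
    speed_kmh: float   # vehicle speed
    accel_ms2: float   # longitudinal acceleration (negative = braking)

def detect_events(samples, harsh_brake=-3.5, harsh_accel=2.5):
    """Flag abrupt braking or harsh acceleration in a telemetry trace."""
    events = []
    for s in samples:
        if s.accel_ms2 <= harsh_brake:
            events.append({"t": s.t, "type": "abrupt_braking", "accel": s.accel_ms2})
        elif s.accel_ms2 >= harsh_accel:
            events.append({"t": s.t, "type": "harsh_acceleration", "accel": s.accel_ms2})
    return events

# A short trace with one hard brake at t = 12.4 s
trace = [TelemetrySample(12.3, 48.0, -1.0), TelemetrySample(12.4, 41.0, -4.2)]
print(detect_events(trace))
# [{'t': 12.4, 'type': 'abrupt_braking', 'accel': -4.2}]
```

In a production system, flags like these would be fused with the video-derived context before any report is written.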

Far from being a consumer gadget, the technology is designed as a B2B solution for risk analysis, fleet safety programs, and usage-based insurance, helping organizations understand driver behavior while reducing fuel costs and accident rates.

Turning Video into Understanding


The intelligence behind this demo comes from NEC’s descriptive video summarization technology, which the company likens to “a video version of ChatGPT.”

Traditional computer vision systems can recognize objects or track motion, but they rarely understand why something happens. NEC’s system uses a combination of computer vision and LLM reasoning to describe and contextualize what the video shows. It extracts the moments most relevant to a user’s purpose and generates a short, fact-based narrative about them—transforming raw video into actionable insight.

To achieve that, NEC integrates over one hundred visual recognition engines—covering object detection, human pose estimation, vehicle tracking, and environmental context—on a unified platform. The AI converts detected visual elements into structured data stored in a proprietary “graph-based multimedia database.” This design grounds every generated explanation in verifiable facts, minimizing the hallucinations that generative models can otherwise produce.
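The exact schema of NEC's graph-based multimedia database is proprietary, so the sketch below only illustrates the general grounding idea: recognition engines emit structured event records, and the LLM is asked to explain nothing beyond those records. The event fields and prompt wording are assumptions for illustration:

```python
# Grounding sketch: the event schema and prompt wording are assumptions,
# not NEC's proprietary graph database format.
import json

# Structured facts as hypothetical recognition engines might emit them
events = [
    {"id": "e1", "t": "00:03:12", "actor": "ego_vehicle", "action": "lane_change",
     "context": {"gap_to_lead_m": 8, "turn_signal": False}},
    {"id": "e2", "t": "00:07:45", "actor": "ego_vehicle", "action": "hard_braking",
     "context": {"decel_ms2": -4.1, "trigger": "pedestrian_crossing"}},
]

def build_grounded_prompt(events):
    """Restrict the LLM to the listed facts and require event-id citations,
    so every sentence in the summary can be traced back to detected data."""
    facts = "\n".join(json.dumps(e) for e in events)
    return (
        "You are a driving-safety coach. Using ONLY the facts below, summarize "
        "the risky moments and give one piece of advice per event. Cite the "
        "event id for every claim.\n\nFACTS:\n" + facts
    )

print(build_grounded_prompt(events))
```

Keeping the generation step tied to structured records like these is what allows the explanations to be verified rather than invented.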

In practice, it means the system can condense ten minutes of driving footage into a brief but precise explanation of what the driver did right, what was risky, and how to improve.

Prompt Engineering Meets the Road


NEC researchers described three main challenges in bringing this idea to life:

  1. Understanding the user’s intent – whether a fleet manager wants safety metrics or an insurer wants behavioral scoring.
  2. Comprehending complex visual context – reading the relationship between vehicles, roads, and conditions.
  3. Generating accurate, natural explanations that match what actually occurred.

According to NEC’s Visual Intelligence Laboratory, LLMs were essential to solving the first and third challenges. The company’s prompt engineers designed instructions that guide the model toward precise, concise summaries. One engineer explained that splitting complex commands into smaller segments improved both accuracy and consistency—an approach that made development faster and the output more reliable.
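NEC has not published its prompts, but the staged approach the engineer described can be sketched roughly as follows. Here `call_llm` stands in for whatever chat-completion client a deployment would use, and the three instructions are illustrative only:

```python
# Sketch of splitting one complex instruction into smaller staged prompts.
# `call_llm` is a placeholder for the deployment's LLM client; the prompt
# text is illustrative, not NEC's actual instructions.
from typing import Callable

def diagnose(event_facts: str, call_llm: Callable[[str], str]) -> str:
    # Stage 1: extraction only, no interpretation yet.
    relevant = call_llm(
        "List only the driving events below that affect safety, one per line.\n"
        + event_facts
    )
    # Stage 2: explanation, constrained to the extracted events.
    explained = call_llm(
        "For each event, explain in one sentence why it is risky. "
        "Do not mention anything that is not listed.\n" + relevant
    )
    # Stage 3: rewrite as concise coaching advice.
    return call_llm(
        "Rewrite the following as two short coaching tips for the driver.\n"
        + explained
    )
```

Each stage has a narrow, checkable job, which is what makes such a chain easier to validate than a single sprawling prompt.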

The result is a system that communicates clearly in human language: “Your deceleration before intersections is abrupt; easing off earlier would improve safety and fuel efficiency.” Feedback like that is far easier to interpret than a generic warning light.

Linking Driving Behavior with Network Quality

NEC’s AI Driving Diagnosis is part of a broader effort to build safe-mobility infrastructure supported by multimodal AI. Earlier, in 2024, the company introduced a Quality of Experience (QoE) prediction system for connected vehicles, capable of forecasting which mobile network or base station will provide the most stable communication for each car or drone in motion.

That technology relies on the same hybrid of video recognition and LLM reasoning to interpret environmental factors, such as traffic congestion, building density, or weather, and to recommend optimal network handovers. Together, these systems form a continuous feedback loop, sketched below:

  • Video AI evaluates how drivers behave.
  • QoE prediction evaluates where they can drive safely and efficiently.
  • The LLM ties both dimensions together, explaining why a change matters.
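How the two signals would actually be joined is not something NEC has detailed. The sketch below simply illustrates the idea of handing a behavior score and a connectivity forecast to the LLM in a single request; all field names and score ranges are assumptions:

```python
# Illustrative only: field names and score ranges are assumptions,
# not NEC's published data model.
from dataclasses import dataclass

@dataclass
class BehaviorScore:
    driver_id: str
    harsh_events: int
    smoothness: float           # 0.0 (rough) .. 1.0 (smooth)

@dataclass
class QoEForecast:
    route_segment: str
    best_base_station: str
    predicted_stability: float  # 0.0 .. 1.0

def coaching_prompt(b: BehaviorScore, q: QoEForecast) -> str:
    """Combine both signals so the LLM can explain why a change matters."""
    return (
        f"Driver {b.driver_id} had {b.harsh_events} harsh events "
        f"(smoothness {b.smoothness:.2f}). On segment {q.route_segment}, "
        f"connectivity via {q.best_base_station} is predicted at "
        f"{q.predicted_stability:.2f} stability. In two sentences, explain "
        "how the driver should adapt and why it matters."
    )

print(coaching_prompt(BehaviorScore("D-102", 3, 0.74),
                      QoEForecast("Route 5, km 12-14", "BS-21", 0.62)))
```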

This convergence positions NEC as one of the few companies linking driving behavior, connectivity quality, and AI-based coaching under one unified technological framework.

Beyond the Dashboard: A Broader B2B Vision

NEC envisions multiple verticals for this technology. Local governments can deploy it to monitor public-transport fleets, ensuring consistent driver performance and reducing accident claims. Logistics companies can use it to track how smoothly delivery trucks are driven, lowering fuel consumption. Insurance providers can integrate AI assessments into telematics products to dynamically adjust risk profiles.

The company has already commercialized related “drive record analysis” services in Japan and is now in discussions with fleet operators, municipal agencies, and insurance carriers on joint pilot programs. Because the system runs securely on-premise or within private clouds, it can handle sensitive video data while maintaining compliance with strict privacy standards.

Why It Matters

Driver-behavior analytics is not new—dashcams and telematics boxes have been scoring smoothness and reaction times for years. But those systems usually stop at numbers and alerts. NEC’s approach moves one step further by understanding context and explaining cause and effect in natural language.

That shift turns data into coaching. It transforms risk analysis from a reactive process into an ongoing conversation between humans and machines, where AI can motivate safer habits before a crash occurs.

For insurers, it means a smarter feedback loop and potentially lower claim costs. For fleet managers, it means objective, explainable performance metrics for hundreds of drivers at once. For NEC, it demonstrates how generative AI—when grounded in factual recognition—can move from the cloud into operational, real-world mobility systems.

Toward a Safer, Smarter Mobility Ecosystem

The NEC demo at CEATEC 2025 was short, but its implications are broad. By merging its expertise in computer vision, network optimization, and generative AI, NEC is building the foundation of a safe-mobility ecosystem—one that not only records how we drive but also helps us drive better.

If current trials with insurance and fleet partners prove successful, the next wave of connected-vehicle services might go beyond tracking our trips. They could soon explain them—turning every drive into an intelligent feedback session, powered by NEC’s video-aware, language-driven AI.
