Mike is the CEO and co-founder of ZeroEyes, the artificial intelligence and weapons detection company. Listen to the podcast to hear Mike’s journey from growing up in Philly to the US Navy SEAL Teams to entrepreneurship.
AI & Security Podcast with guest Kieran Carroll
AI Weapons Detection from ZeroEyes
Listen to the podcast here.
Every Second Matters – AI Gun Detection Video Analytics Emergency Response
See the full article HERE.
Death tolls from mass shootings in the US have risen in recent years. One of the crucial challenges in coping with these attacks is response time. In fact, the average response time in a mass shooting event in the US is between 12 and 15 minutes: long enough to cause a lot of destruction. A new video analytics solution aims to curb that time frame by detecting weapons in real time, keeping schools and other public spaces safe.
Using artificial intelligence, ZeroEyes’ gun detection technology can predict potential threats and prevent mass shooter situations from escalating. It can be integrated with analog and digital security cameras already installed in public areas.
The AI gun detection software is able to detect assault rifles, semiautomatic pistols and shotguns. It will only detect a weapon if it is visible. It detects an approaching threat through existing security cameras. Every camera has a real-time streaming protocol (RTSP) data path.
The DeepZero analytic platform runs in two parts: pre-processing and inferencing. Pre-processing uses motion detection to capture 10 frames, or 10 opportunities per second, to find a gun. Inferencing determines if a gun is present. Every keyframe image goes through the ZeroEyes dataset.
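In outline, that detection loop is a motion gate placed in front of a model pass. A minimal pure-Python sketch of the two-stage idea follows; the function names, thresholds, and the stubbed detector are illustrative assumptions, not ZeroEyes' actual code:

```python
def motion_detected(prev_frame, frame, threshold=10.0):
    """Gate on simple frame differencing: only frames with enough pixel
    change are worth running the (expensive) detector on."""
    diff = sum(abs(a - b) for a, b in zip(prev_frame, frame)) / len(frame)
    return diff > threshold

def run_detector(frame):
    """Stand-in for the gun-detection model; returns a confidence score.
    (Hypothetical stub: a real system would run a trained network here.)"""
    return 0.0

def process_stream(frames, alert_threshold=0.9):
    """Pre-processing + inferencing over sampled frames (~10 per second).
    Returns the indices of frames that produced a detection."""
    alerts = []
    prev = frames[0]
    for i, frame in enumerate(frames[1:], start=1):
        if motion_detected(prev, frame):      # pre-processing stage
            score = run_detector(frame)       # inferencing stage
            if score > alert_threshold:
                alerts.append(i)
        prev = frame
    return alerts
```

In a real deployment the frames would be decoded from each camera's RTSP stream, and the stub would be replaced by the trained model.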
A weapon alert goes to the monitoring team or onsite security team; if a true weapon is confirmed, an alert is sent to local emergency dispatch. This process takes three to five seconds and bypasses the traditional dispatch process.
The software integrates with existing satellite mapping of buildings. As a shooter passes a camera, the map will light up. This allows first responders to know the precise location of a threat. The mapping system can prevent a shooter from entering a building by locking the doors.
The system is currently on site and operational in schools and other buildings in the US, and in the coming months ZeroEyes will install it in 20 new schools and a major skyscraper.
The company receives two to three false positives daily across more than 50 cameras, but clients receive alerts only for real threats, according to technical.ly.
Interested in learning more about video analytics solutions? Attend i-HLS’s InnoTech Expo in Tel Aviv – Israel’s largest innovation, HLS, and cyber technologies expo – on November 18-19, 2020 at Expo Tel Aviv, Pavilion 2.
See more articles with ZeroEyes in the News.
ZeroEyes uses AI to identify guns in mass shooter situations. Here’s how the tech works
See the original article here.
A mockup of ZeroEyes’ platform.
In 2019, over 400 mass shootings were reported in the U.S.
The average response time in these situations was between 12 and 15 minutes, according to ZeroEyes, a video analytics company based at the Pennovation Center in Grays Ferry.
That’s long enough to cause a lot of destruction. ZeroEyes aims to curb that: The company uses artificial intelligence to predict potential threats and prevent mass shooter situations from escalating. The technology, which can be integrated with analog and digital security cameras already installed in public areas, detects weapons in real time.
“The cofounders were just sick of seeing headlines where kids got killed or people in an office building got killed,” said Kieran Carroll, ZeroEyes’ VP of enterprise strategy and government affairs. “When we were in the military, we would have loved to have a tool like this.”
Kieran Carroll. (Courtesy photo)
Mike Lahiff and Rob Huberty, both former Navy SEALs who received their MBAs from The Wharton School, started the company two years ago. They combined their academic and military backgrounds with other former SEALs to figure out a better solution to mass shootings.
ZeroEyes is a part of the Bunker Labs network, which is dedicated to helping those in the military community launch their own businesses. It also participated in Dreamit Venture’s fall 2019 SecureTech Program for physical security startups and was nominated this month as a top tech startup for the 2020 PACT Awards.
ZeroEyes currently has 16 full-time employees, nine of whom are former military. Carroll said its tech team does not have any formally trained data scientists with an AI background. Instead, its team is entirely self-taught through a trial-and-error process that involved testing the technology at shopping malls, office parks and schools. From there, a comprehensive data set with thousands of images was constructed.
The AI software is able to detect assault rifles (AR and AK style weapons), semiautomatic pistols and shotguns. It will only detect a weapon if it is visible.
Here’s the breakdown of how the ZeroEyes software works:
ZeroEyes detects an approaching threat through existing security cameras. Every camera has a real-time streaming protocol (RTSP) data path.
ZeroEyes’ analytic platform, DeepZero, runs in two parts: pre-processing and inferencing. Pre-processing uses motion detection to capture 10 frames, or 10 opportunities per second, to find a gun. Inferencing determines if a gun is present. Every keyframe image goes through the ZeroEyes dataset.
Next, the alert of a weapon goes to the monitoring team or onsite security team.
If a true weapon is detected, an alert is sent to a local emergency dispatch (such as a 911 call center), onsite security staff, police and school administrators. This process takes three to five seconds and bypasses the traditional dispatch process.
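The alert path above (model detection, then human verification, then a direct fan-out that bypasses traditional dispatch) amounts to a small state machine. A hypothetical sketch, with the recipient list taken from the article and everything else illustrative:

```python
from dataclasses import dataclass, field

@dataclass
class Alert:
    camera_id: str
    recipients: list = field(default_factory=list)
    verified: bool = False

def handle_detection(camera_id, human_confirms):
    """Detection -> monitoring-team verification -> dispatch fan-out.
    `human_confirms` stands in for the monitoring/onsite security check;
    only verified alerts are fanned out to dispatch and staff."""
    alert = Alert(camera_id)
    if human_confirms(camera_id):
        alert.verified = True
        # Bypass the traditional dispatch chain: notify everyone at once.
        alert.recipients = ["911 dispatch", "onsite security",
                            "police", "school administrators"]
    return alert
```

False positives stop at the verification step, which is why (per the article) clients only ever see alerts for real threats.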
The software integrates with existing satellite mapping of buildings. As a shooter passes a camera, the map will light up. This allows first responders to know the precise location of a threat. The mapping system can prevent a shooter from entering a building by locking the doors.
“With our system, you can essentially have a person sitting behind every single camera,” Carroll said. “This provides very clear situational awareness and allows people to make quicker decisions which ultimately shaves off time.”
When ZeroEyes ran active shooter drills at Rancocas Valley Regional High School in Burlington County, New Jersey, where the system is currently operating, the team saw a 50% reduction in response times.
ZeroEyes receives two to three false positives per day over more than 50 cameras, but clients will never get false positives, only the alert of a real threat.
Clients “won’t even have to think about [a shooting] until it happens,” Carroll said.
The system is currently on site and operational in schools and other buildings in New Jersey, New York, Pennsylvania, Wisconsin and D.C. In the coming months, Carroll says it will be installed into 20 more schools. The system will also soon be installed in a “major Chicago skyscraper.”
As ZeroEyes expands to more schools and hires more people, it aims to be the best and the fastest in the weapons detection industry.
“In the next 10 years, in every public place, there needs to be a weapons detection capability in that building,” Carroll said.
P.S. Check out the Philadelphia Shooting Victims Dashboard, which shows data for every gunshot victim in the city for the last five years.
Verizon courts public safety with 5G, Super Bowl ad
See the original article HERE.
Not one to let its lead in the public safety market slip to rival AT&T, which won the FirstNet contract in 2017, Verizon is courting public safety with the latest and greatest in technologies that use 5G.
Granted, 5G is not yet widely deployed—Verizon this week just announced its 5G Ultra Wideband is now live in parts of Little Rock, Arkansas; Kansas City, Missouri; and Cincinnati, Ohio. So Verizon is using 5G first responder labs to work with startups and innovators around the country that are developing leading-edge solutions. This past week, five startups participated in a graduation of sorts at an event in San Francisco. Previous graduations have been held in New York City and Washington, D.C.
Verizon works with the startups for 12 weeks. One of the startups at this week’s event, Lumineye, developed wall-penetrating radar sensing to help first responders identify people through walls. Another, ZeroEyes, offers an active-shooter solution that uses artificial intelligence (AI) to actively monitor camera feeds and detect weapons.
Verizon has featured 5G-related firefighting applications in TV commercials leading up to the Super Bowl, where that theme will continue this weekend. There are myriad reasons for running the commercials, but part of it boils down to a desire to let people know that 5G isn’t just for consumer and enterprise segments.
“What we want to do is make sure that we continue to innovate and develop our network to be better for first responders. 5G is absolutely a part of that,” said Nick Nilan, director of Product Development, Public Sector, at Verizon. “We want to make sure that we’re developing 5G solutions for our first responders at the same time.”
A technology from Qwake is featured in a commercial that shows how firefighters can use it to see through smoke. Older technologies in the market can do something similar with thermal imaging, but those are bulky; this one lets the firefighter keep their hands free and see what’s going on, Nilan said. The technology also has to work with the firefighter’s existing helmet and other gear.
Verizon’s Ultra Wideband 5G service uses millimeter wave spectrum. Engineers have learned a lot about how millimeter wave works and how devices connect to it. “We’re continuing to learn about how to better deploy the networks, but we’re pretty confident that we will have the right networks in place to be able to power these solutions for first responders that rely on it,” he told Fierce. “It’s definitely a technological challenge that we’re overcoming.”
It’s worth noting that these solutions for public safety also need to work over existing 4G LTE networks.
RELATED: Verizon admits mistake in throttling firefighters’ LTE speeds
Even though states opted into FirstNet, that doesn’t force every public safety, firefighting or EMS agency to opt in, and that’s obviously important to Verizon, which still serves the majority of first responders in the U.S., according to Nilan.
“We believe every first responder/public safety agency should have the ability to choose the best network and best network partner for them,” he added.
In fact, Verizon doesn’t believe every first responder will be on the same network nationwide. Some of them get their cell phone service through their agency and others bring their personal device. It depends on what works best for them in a given geography.
RELATED: AT&T, Verizon spar over public safety interoperability
What’s important is that the networks work together. When the plan for FirstNet was drafted after September 11, 2001, there was a lack of interoperability between networks. “Now, in 2020, we have the advantage of interoperable standards built into our cellular networks,” he said.
The industry has solved, through standards, a lot of the interoperability issues between networks as well as between LMR networks and cellular. “We want to make sure that continues,” he said.
Verizon Bringing 5G to First Responders
See the original article HERE.
Verizon Wireless believes that 5G, the next generation of cellular communications, won’t just allow smartphones to download movies faster; it will also improve emergency service response times and help combat crime.
Last week, Verizon completed the rollout of the third cohort for its 5G First Responder Lab, with an additional five companies working on technology for firefighters, police, paramedics and other emergency services.
Established in May 2019, the 5G First Responder Lab has grown to now include 15 startups working together on new technology solutions with the goal of making the products commercially viable to run on a 5G network in the future.
Nick Nilan, head of product development for the public sector at Verizon, told Electronics360 that some traction is already happening: Aerial Applications, a drone mapping company that designed software to offer views from the sky over disaster-torn areas, was recently awarded a contract with the federal government. While initially the contract will be for the drone mapping technology to run over a 4G network, it will eventually transition to 5G when the network is further deployed and developed.
Edgybees, which augments live video feeds with geo-information layers captured from drone, vehicle, CCTV and mobile cameras, has been working in Australia to help fight the wildfires raging across the country. Edgybees provides real-time video feeds overlaid with maps and redirects people around fires. Again, the technology is being used on 4G networks, but once 5G is widely deployed, it will be ported to the 5G network.
While not every product is currently in use and some are more experimental in nature, most of the 15 companies are developing 5G solutions that will eventually be used commercially, Nilan said.
Autonomous security robots are under development to work with 5G. Source: Knightscope
“We are working with companies that have a track record or a solid vision on bringing a new product to market,” Nilan said. “One of the first participants was Blueforce that uses a sensor fusion platform to enable smarter policing that has been working with police forces for 10 years and we are looking to leverage that technology to 5G. So many of these companies, we are very confident that these will come to market.”
Other more experimental offerings include Qwake Technologies, which is making augmented reality for firefighters. While the product is not yet available, Verizon is working with Qwake to launch it later this year.
“Another company called Lumineye is developing wall-penetrating radar that will essentially allow law enforcement or firefighters to see through walls, whether through rubble, in a burning building or in a hostage situation,” Nilan said. “What we didn’t look at in the 5G First Responder Lab were very far-off ideas that cannot be attained, that are just ideas.”
The five companies in Verizon’s third cohort for its 5G First Responder Lab include Lumineye and Edgybees as well as Knightscope, a maker of autonomous security robots, SimX, a maker of medical simulation software for virtual and augmented reality platforms, and ZeroEyes, a software company that helps responders predict, prevent and protect against active shooter scenarios at schools and other public spaces.
The next step for the 5G First Responder Lab is to expand the number of companies working on technology offerings. Verizon already has open applications for its fourth cohort, which will kick off next month and will focus on how 5G can be used to enable faster dispatch times.
“This is going to be the year where we see more deployments with 5G and so we want to have more solutions ready for when 5G is ready,” Nilan said.
To contact the author of this article, email Peter.Brown@ieeeglobalspec.com
Schools are using facial recognition to try to stop shootings. Here’s why they should think twice.
See the full article HERE.
Facial recognition is just one of several AI-powered security tools showing up at schools.
For years, the Denver public school system worked with Video Insight, a Houston-based video management software company that centralized the storage of video footage used across its campuses. So when Panasonic acquired Video Insight, school officials simply transferred the job of updating and expanding their security system to the Japanese electronics giant. That meant new digital HD cameras and access to more powerful analytics software, including Panasonic’s facial recognition, a tool the public school system’s safety department is now exploring.
Denver, where some activists are pushing for a ban on government use of facial recognition, is not alone. Mass shootings have put school administrators across the country on edge, and they’re understandably looking at anything that might prevent another tragedy.
Safety concerns have led some schools to consider artificial intelligence-enabled tools, including facial recognition software; AI that can scan video feeds for signs of brandished weapons; even analytics tools that warn when there’s been suspicious movement in a usually-empty hallway. Recode has identified about 20 companies that have sold or have expressed interest in selling such technology to educational institutions.
On its face, facial recognition seems like it might help keep kids safe; in a promotional video by Panasonic, a Denver public school official argues that the company’s AI could be used to prevent potentially dangerous people — like students expelled because they brought a weapon to school or someone barred by a restraining order — from entering a school campus (though the public system has not yet implemented the tool). Most schools appear to be thinking about facial recognition as a way to regulate entry onto a campus, creating databases of people who have previously been flagged.
But facial recognition and similar software have also been suggested for more routine tasks at school, like taking attendance and investigating code of conduct violations. And critics add that it’s not apparent that this software works as advertised, and, with relatively few trials in schools, there’s no real guarantee it will actually make students safer.
High-tech security software could make students feel policed and surveilled, and research has already demonstrated that facial recognition can be inaccurate, especially for people of color and women, as well as other groups. (Those findings were confirmed by a National Institute of Standards and Technology report released Thursday.) Meanwhile, legislation explicitly regulating the use of these tools remains scant, and some critics worry that the sensitive data that facial recognition systems create could ultimately be shared with law enforcement, or a federal agency such as Immigration and Customs Enforcement (ICE).
“Facial recognition is biased, broken, and it gets it wrong. It’s going to put a lot of students in danger, especially students of color,” warns Albert Fox Cahn, the executive director and founder of a legal nonprofit called the Surveillance Technology Oversight Project. “We know that this technology will get it wrong quite a bit, and we also have no evidence to show that it has any public safety benefit whatsoever, especially in the grandiose scenarios that proponents put forward.”
Companies market facial recognition as a safety tool
Here’s how the tech could work in a school setting. Facial recognition technology compares images or videos of people entering or within a school building, with a database of already-known individuals and near-instantly confirms their identity, usually for the purpose of alerting security staff or automatically admitting someone into an area.
That database could include a school’s current staff and parents who have been approved to enter a school; it might also include particular individuals a school does not want on its premises, such as expelled students, former employees, registered sex offenders (or those listed on other court-administered databases), or other people school officials might decide to deem suspicious (and have an image of).
Which means that to make use of these tools to preemptively stop a violent event, school staff would have to already know that a person was potentially dangerous and unwelcome on campus — and flag them in the system.
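At its core, the matching step described above is a nearest-neighbor search: compare an incoming face embedding against the flagged-person database and report the best match above a similarity threshold. A toy sketch of only that step (real systems use learned embeddings with hundreds of dimensions; the vectors and threshold here are placeholders):

```python
import math

def cosine_similarity(a, b):
    """Similarity between two face embeddings (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def identify(embedding, watchlist, threshold=0.9):
    """Return the best watchlist match above the threshold, else None.
    `watchlist` maps a flagged person's name to a reference embedding."""
    best_name, best_score = None, threshold
    for name, ref in watchlist.items():
        score = cosine_similarity(embedding, ref)
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```

The threshold choice is exactly where the accuracy disputes discussed below come in: set it low and false matches rise; set it high and flagged individuals slip through.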
It’s important to note that school shooters are often not previously banned by school staff. Systems like these, though, could theoretically allow a school official to flag a student for any reason, or no reason at all (and regulation of these tools isn’t clear, more on that later).
Compared to expelled students or other people deemed threats, schools appear to be more apprehensive about entering sensitive data about the entire student body into a facial recognition database. That’s part of why, when the Lockport School District in New York announced that it would install a facial recognition tool, the state’s department of education called for the plan to be put on hold. The school district and state education officials have since been going back and forth over rules for implementing the system to ensure the database will only be used on flagged individuals who are not students, not on students themselves.
Ultimately, we don’t know how many schools have made use of a facial recognition-based tool (Wired found eight public school systems), but it’s not clear that any of the systems deployed at schools thus far have yet stopped a violent event. Mike Vance, a senior director of product management at RealNetworks — which currently provides schools recognition software for free or at a discounted rate — said he’s aware of one school that set up an alert on a person who had expressed general plans for a school shooting in the surrounding area (that person didn’t show).
Another school, Vance said, set up an alert on a student who the school administrators had reason to believe would threaten the school’s safety (that student didn’t show, either). He added that, generally, RealNetworks’ system is not used for recognizing students.
While Vance says that RealNetworks directs schools to its guidelines for best practices, he emphasizes that the company can’t control, and can’t access or see, how schools are actually using the tool.
Importantly, there’s no one type of company that might bring facial recognition to a school. While it’s possible the technology could require installing new, higher-quality surveillance cameras, some can function as a software extension of a school’s existing security infrastructure. Some companies may produce facial recognition software themselves, but not always. For instance, the Oklahoma City-based security firm TriCorps has deployed a Panasonic facial recognition tool, called FacePro, at a school in Missouri. Panasonic also appears to offer facial recognition to schools directly, as in the case of Denver Public Schools. (Panasonic did not respond to a request for comment.)
“Facial recognition is becoming more broadly available and often as a new function in already established CCTV/[s]ecurity products,” explained Daniel Schwarz, a privacy and technology strategist at the New York Civil Liberties Union in an email to Recode. “School districts could unwittingly purchase face surveillance tools without even knowing about it.”
He said the NYCLU had spoken to at least one school district that bought a biometric tool but “wasn’t aware of the functionality.”
Beware of “mission creep”
Facial recognition could do more than notify officials when people suspected to be dangerous enter schools. For school officials, that might seem like more bang for their buck, but critics worry that excessive use of the tool could turn into surveillance of students. “We don’t have a single example of a costly and invasive surveillance tool that’s deployed that’s only used for the thing we’re told it will be,” Cahn said.
Mike Vance, RealNetworks’ senior director of product management, says that schools are using facial recognition to preemptively enforce child custody agreements. He gave examples of schools that have set alerts in their facial recognition systems on birth parents who have been barred by court order, or other legal processes, from making contact with a child. (He’s not aware of any cases in which a school has caught a parent this way.) Wired reported that a facial recognition system was even used to check whether a student believed to have run away from home had shown up at school.
There’s particular worry that facial recognition tools could be used to police and investigate student behavior. The superintendent of one New York school district considering the technology floated the idea of using it to enforce codes of conduct, according to the Buffalo News. That’s concerning to critics who point out that facial recognition can be especially inaccurate when applied to people of color, and to women with darker skin in particular (you can read more about bias in facial recognition here, here, and here), and could worsen the school-to-prison pipeline.
“[F]acial recognition technology will necessarily mean Black and brown students, who are already more likely to be punished for perceived misbehavior, are more commonly misidentified, reinforcing the criminalization of Black and brown people,” wrote NYCLU organizer Toni Smith-Thompson last year. “That will happen even as facial recognition algorithms get better at correctly recognizing people’s faces.”
Facial recognition is already being used to take attendance, an application that would presumably require a database of identifiable information on every student at a school. In the US, at least one company, Face Six, sells attendance-taking facial recognition to educational institutions. The technology is in about two dozen educational institutions, both in the US and elsewhere, a number its CEO says reflects a “mix” of private and public schools, as well as universities.
Facial recognition-as-attendance is also popping up abroad, including in China (though its use there may be curbed). Orahi, a startup that works in India, appears to be using Amazon’s controversial facial recognition tool Rekognition to automatically take the roll on school buses and in schools. (Amazon did not respond to a request as to whether it’s sold Rekognition to other startups that work with students or at schools.) In Sweden earlier this year, a municipality was fined after a local school tested using facial recognition to track student attendance, in violation of the General Data Protection Regulation (GDPR); a similar tool in Australia also sparked backlash.
Facial recognition isn’t the only AI-based security tool schools are using
Another increasingly popular application of AI is weapon detection. The idea is to use AI to understand the image of a weapon (like a handgun or an assault rifle), and then alert school staff anytime a corresponding item is recognized in a security video feed. “At a very simple level, we are going out and sourcing images and videos of guns [and] of guns being pulled in a variety of scenarios, [and] different types of weapons, like knives, guns, and rifles. And we’re just collecting as many data points as we can about what a gun looks like or what a weapon looks like,” explains Trueface CEO Shaun Moore.
Moore says he isn’t aware of a violent event that his software has stopped yet, though he emphasizes that it’s early. But the technology is growing more widespread. Another company, Actuate, says its system is in use at “almost a dozen private schools and school districts.” ZeroEyes, another gun-detection service, says its tool is being used at eight locations and is closing contracts with 30 more, most of which are schools.
“The way to think about how this type of AI works is that it can recognize the shape of a gun in the same way that a human can, but it can’t understand the context,” explained Actuate’s chief product officer and cofounder, Ben Ziomek, in an email. “If an object looks like a weapon to a human in a few frames, our system will mark it as a weapon.” The system could theoretically flag prop firearms used for a school play, or certain replica toy weapons.
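Ziomek’s point, that an object flagged “in a few frames” becomes a detection, suggests a common mitigation: require the detector to stay confident across several consecutive frames before alerting, so a single ambiguous frame (a walkie-talkie, a prop) is less likely to fire. A sketch of that temporal filter; the thresholds are illustrative assumptions, not any vendor’s actual values:

```python
def should_alert(frame_scores, threshold=0.8, min_consecutive=3):
    """Alert only if the detector stays confident for several frames in
    a row. `frame_scores` are per-frame weapon confidences from a model."""
    run = 0
    for score in frame_scores:
        run = run + 1 if score >= threshold else 0
        if run >= min_consecutive:
            return True
    return False
```

The trade-off is latency: each extra required frame adds a fraction of a second before a true detection can alert, which matters in a domain where vendors advertise three-to-five-second response paths.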
But while the technology is sometimes sold in conjunction with facial recognition, it still comes with risks. SN Technologies, which is offering weapons detection in addition to facial recognition to Lockport public schools, said that during one test its system falsely flagged a walkie-talkie pointed like a handgun. Several of the companies admit that their systems could produce false positives — while also claiming high accuracy rates — and emphasize that school security staff are responsible for checking that the software has flagged a real weapon.
“We just want to help that person make that decision faster,” Moore said. “It’s very difficult to monitor that many camera feeds in real time.”
“[W]e’ve had incidents where students were brandishing mock weapons used for a school play,” explained Ziomek. “An off-site team would have called law enforcement because the weapon looked completely real, but the security staff on-site knew the context of the situation and gave the students a firm talking-to rather than calling the police.”
But critics say these systems shouldn’t necessarily be trusted. “When you add on all of the visual noise of being in a school with hundreds of people moving around — and all these things in motion — and no static background, there are a lot of different everyday objects that will end up setting it off,” Cahn said.
There are also other, AI-based tools that schools can purchase, like a “self-learning analytics” feature sold by the Canadian security-technology firm Avigilon, a company owned by Motorola Solutions. The company explains that its AI studies video feeds collected by cameras throughout a school and learns normal patterns of traffic. That means it can flag unusual activity — like a lot of motion in a hallway at a time when it’s normally deserted.
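A “self-learning” motion baseline of this kind can be approximated with simple per-hour statistics: learn the typical activity level for each hour of the day, then flag readings far above that norm. A minimal illustration of the general idea (Avigilon’s actual method is proprietary; this is not it):

```python
import statistics

def build_baseline(history):
    """history maps hour-of-day -> list of motion counts observed at that
    hour; returns per-hour (mean, stdev) describing normal activity."""
    return {h: (statistics.mean(v), statistics.pstdev(v))
            for h, v in history.items()}

def is_unusual(baseline, hour, motion, z=3.0):
    """Flag motion far above the learned norm for that hour.
    A stdev floor of 1.0 keeps very quiet hours from flagging trivially."""
    mean, stdev = baseline[hour]
    return motion > mean + z * max(stdev, 1.0)
```

Under this scheme, heavy motion in a hallway at 2 am flags while the same motion at lunchtime does not, which is the behavior the company describes.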
The company also sells an “appearance search” feature, which is on track to be used by Florida’s Broward County School District, including Marjory Stoneman Douglas High School (it’s already in other schools). For instance, a school safety official could observe video of a person who appears in a classroom at 2 pm, and search to see where else that person has appeared on the school’s video feeds, based on characteristics of their face, their clothing, and gender, among other factors.
The US has been slow to regulate facial recognition systems
Facial recognition requires creating databases of sensitive and personally-identifiable data — immutable information about our faces — that we may not want schools to possess. For one thing, the Surveillance Technology Oversight Project’s Cahn is doubtful school officials are prepared to keep such information secure and protected from hackers.
But, like other critics, he’s also worried about whether these systems will be used to target undocumented students and students of color. “Many school districts have a history of working hand-in-hand with law enforcement to create the school-to-prison pipeline, so we certainly can’t trust that schools will push back against a request from law enforcement,” Cahn said. “But even if these schools were to oppose [law enforcement], they simply don’t have a legal mechanism to block the government from getting a court order to obtain this data.”
According to the agency’s website, ICE considers schools “sensitive locations,” meaning that they’re not supposed to be targeted for enforcement activities unless officers are led to the location by other “law enforcement actions,” there are “exigent circumstances,” or prior approval is obtained from a “designated supervisory official.”
It’s worth noting that some schools already have agreements with police departments to share access to their live video feeds. For instance, the Suffolk Police Department in Long Island, New York, operates a program called “Sharing to Help Access Remote Entry” (SHARE), through which officers can remotely access school security video feeds. The system is meant to be used in an emergency (like an active shooter situation), and already uses a license plate identification system that can also determine the make and color of a car that’s parked on campus.
The county’s police chief, Stuart Cameron, told Recode that the department is exploring facial recognition technology (like it would explore any tool). If it does choose to adopt facial recognition, he says there’s no reason why the department wouldn’t use the tool in conjunction with the SHARE program.
Meanwhile, laws that clearly apply to these tools are few and far between. Federal regulation governing facial recognition nationally doesn’t exist yet, though it’s possible existing privacy or education laws — like the Family Educational Rights and Privacy Act — could be applied to certain applications of the technology. Still, the US Department of Education told Recode that it hasn’t issued any specific guidance regarding facial recognition.
On the state level, Illinois and Texas have both passed biometric information privacy laws that appear to require consent before using facial recognition. RealNetworks’s Vance says his company notifies schools of this legislation.
Vance also points to a 2014 Florida state law that explicitly bans the collection of biometric information from students. Moore, of Trueface, told Recode that his company has held off on deploying its facial recognition technology in schools because it wants to wait for more clarity regarding regulation.
Meanwhile, at the local level, a city spokesperson for Somerville, Massachusetts — one of the first US cities to ban facial recognition — says that law includes the city’s public schools. However, a legislative aide who worked on San Francisco’s facial recognition ban told Recode the law would not directly apply to public schools (meaning schools could technically buy a facial recognition tool), but that the city’s ban means that San Francisco police couldn’t use or receive information the system might collect.
This overall legal patchwork has left many, including the very companies selling these technologies, desperate for clearer regulation. “Having federal guidelines or federal regulations around facial recognition would be a really good thing for the industry to make sure we’re all playing by the same set of rules,” Vance, of RealNetworks, said.
Robots, Drones Among Startups in Verizon’s 5G First Responder Lab
See the full article HERE.
Robotics and aerial drone startup companies were among those chosen by Verizon to participate in the company’s “Cohort 3” for the 5G First Responder Lab. The startups in the Cohort 3 group will focus on developing artificial intelligence for weapon detection, geo-intelligence, autonomous security, and smart cities solutions, powered by Verizon’s 5G Ultra Wideband network, the company said in a statement.
The startups were announced at the company’s #OCR2019 event, a three-day public safety seminar hosted by Verizon and Nokia, where technologies for public safety were showcased in live simulations for first responders and government officials. The event included a series of six realistic crisis scenarios that let industry and government stakeholders experience firsthand how advanced technologies can work under pressure in a crisis.
Variety of startups
The five companies chosen for the cohort were:
Edgybees, which augments live video feeds captured from any camera, human input, or other data sources to provide clarity on operational environments.
Ekin, which is developing software and hardware with AI to make modern cities safer and smarter.
Knightscope, which develops autonomous security technologies through self-driving technology, robotics, and AI.
Lumineye, which provides wall-penetrating radar sensing to help first responders identify people through walls.
ZeroEyes, which develops a life-saving active-shooter detection system that uses AI to continuously monitor camera feeds for weapons.
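The monitoring approach described above — sampling frames from existing camera feeds and running each through a detection model, alerting when a weapon is found — can be sketched roughly as below. This is a minimal illustration only: the `detect_weapon` scoring function and `alert` hook are hypothetical stand-ins, not ZeroEyes’ actual API or model.

```python
from dataclasses import dataclass
from typing import Callable, Iterable, List

@dataclass
class Detection:
    frame_id: int
    label: str
    confidence: float

def monitor_feed(
    frames: Iterable[bytes],
    detect_weapon: Callable[[bytes], float],
    alert: Callable[[Detection], None],
    threshold: float = 0.9,
) -> List[Detection]:
    """Run a detector over every sampled frame; fire an alert on each hit.

    In a real deployment, `frames` would be keyframes decoded from a
    camera's video stream and `detect_weapon` a trained vision model;
    here both are placeholders for illustration.
    """
    hits: List[Detection] = []
    for frame_id, frame in enumerate(frames):
        score = detect_weapon(frame)
        if score >= threshold:
            detection = Detection(frame_id, "weapon", score)
            hits.append(detection)
            alert(detection)  # e.g., notify the monitoring team or dispatch
    return hits

# Toy demo: a fake detector that scores frames containing b"gun" as 1.0.
frames = [b"empty hallway", b"gun visible", b"empty hallway"]
alerts: List[Detection] = []
hits = monitor_feed(frames, lambda f: 1.0 if b"gun" in f else 0.0, alerts.append)
```

The key design point is that detection and alerting are decoupled: the same loop can feed an on-site security team, a remote monitoring center, or an emergency dispatch integration simply by swapping the `alert` callback.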
“As our final cohort of the year, we’re excited to welcome Cohort 3 to the 5G First Responder Lab,” said Nick Nilan, director of public safety product development at Verizon. “The innovative technologies developed and fostered by Cohorts 1 and 2 will make an incredible difference in the market, and we can’t wait to see what Cohort 3 develops during their time in the lab.”
The company’s 5G First Responder Lab has brought together 15 companies in three separate groups (cohorts) that work at Verizon’s 5G Lab in Washington, D.C., to develop public safety solutions. Previous cohorts have focused on advanced imaging, drones, virtual reality training, and education for public safety and smart city technology. The company has also set up similar 5G labs that focus on other areas, including a robotics lab in Boston.
“We’re excited for the great things ahead with Cohort 3 and look forward to finishing the first year of this program strong,” said Nathanial Wish, co-founder & CEO, Responder Corp. “Each cohort has brought a wealth of innovation to the table, ultimately helping to create advanced solutions with potentially life-saving technologies that are built on Verizon 5G Ultra Wideband.”
ZeroEyes: Kieran Carroll outlines capabilities of weapons-detection system built on existing cameras, AI
Watch the video interview HERE.
Written by Donny Jackson
21st November 2019
Kieran Carroll, vice president of operations and government affairs for ZeroEyes, explains how ZeroEyes is able to leverage existing camera systems and the company’s artificial-intelligence-driven solution to identify firearms and immediately alert key personnel of a potential threat before shots are fired. Carroll spoke with IWCE’s Urgent Communications Editor Donny Jackson yesterday during the Operation Convergent Response (OCR) 2019 event in Perry, Ga., which was co-sponsored by Verizon and Nokia.