Sep 24, 2013
 

DARPA’s Plan to Flood the Sea With Drones, Carrying More Drones

 

WIRED / Danger Room
by Allen McDuffee
September 13, 2013

 


 

DARPA, the Pentagon’s research agency, has recently revealed its plans to boost the Navy’s response to threats in international waters by developing submerged unmanned platforms that can be deployed at a moment’s notice.

Hydra, named after the many-headed serpent of Greek mythology, would create an undersea network of unmanned payloads and platforms to increase capability and speed the response to threats like piracy, the rising number of ungoverned states, and sophisticated defenses at a time when the Pentagon is forced to make budget cuts. According to DARPA, the Hydra system “represents a cost effective way to add undersea capacity that can be tailored to support each mission” while still allowing the Navy to conduct special operations and contingency missions. In other words, a shrinking fleet of naval vessels can only be in so many places at once.

“The climate of budget austerity runs up against an uncertain security environment that includes natural disasters, piracy, ungoverned states, and the proliferation of sophisticated defense technologies,” said Scott Littlefield, DARPA program manager, in a statement. “An unmanned technology infrastructure staged below the oceans’ surface could relieve some of that resource strain and expand military capabilities in this increasingly challenging space.”

The Hydra system is intended to be delivered in international waters by ships, submarines, or aircraft, and to communicate with manned and unmanned platforms across air, surface, and undersea operations.

Unlike the Upward Falling Payloads (UFPs) program DARPA announced in January that would submerge massive waterproof containers intended to store weapons, drones and supplies for years at a time, Hydra is a highly mobile platform that can be deployed for a few weeks or months in relatively shallow international waters.

“By separating capabilities from the platforms that deliver them, Hydra would enable naval forces to deliver those capabilities much faster and more cost-effectively wherever needed,” said Littlefield. “It is envisioned to work across air, underwater, and surface operations, enabling all three to perform their missions better.”

Proposals are due October 22, but it may well be 2018 before Hydra lands in the ocean.

 

Direct Link:  http://www.wired.com/dangerroom/2013/09/hydra-darpa/

Jul 01, 2013
 

Defense Department building its own secure 4G network

The department hopes the new network will improve collaboration among separate branches of the military, the chairman of the Joint Chiefs of Staff says.

CNET
by Steven Musil
June 27, 2013


 

The U.S. Department of Defense is building its own secure 4G network to improve collaboration among separate branches of the military, according to the chairman of the U.S. Joint Chiefs of Staff.

The network is part of an effort dubbed “Joint Information Environment,” which will consolidate 15,000 Defense Department networks in the cloud, Army Gen. Martin Dempsey said in a speech (PDF) delivered Thursday at the Brookings Institution, an influential think tank based in Washington, D.C. In addition to greater collaboration, the new network will be “significantly more secure, helping ensure the integrity of our battle systems in the face of disruption,” Dempsey said.

The network, which will allow access to a variety of mobile devices, is expected to be operational by the middle of next year, Dempsey said, as he gave a preview of the type of security to which service people will be privy.

“This phone would make both Batman and James Bond jealous,” he said, holding up what he said was a secure mobile phone. “With tools like this, the smartphone generation joining our military will help us pioneer a new era of mobile command and control.”

Part of the plan is a federated app store that will allow Defense Department users to share content across several devices, he said.

“By using off-the-shelf technology, we are bringing the full force of the tech revolution into the classified environment,” Dempsey said.

Earlier this year, the U.S. Defense Information Systems Agency approved the use of Apple iOS 6 devices, Samsung Galaxy S4 phones, and BlackBerry 10 devices by U.S. government and military departments that tap into Department of Defense networks. The Defense Department currently has more than 600,000 commercial mobile devices in operational and pilot use, including 470,000 BlackBerrys, 41,000 Apple devices, and 8,700 Android devices.

Noting that the U.S. military has made significant progress in embracing the cyber realm, Dempsey echoed previous Defense Department concerns that efforts to protect critical private-sector infrastructure facilities are “lagging.”

“Too few companies have invested adequately in cybersecurity. I worry that adversaries will seek to exploit this chink in our nation’s armor,” the general said. “To them, our economy and infrastructure are softer targets than our military.”

Improving battlefield communications infrastructure has been a prominent goal of the Defense Department. The Defense Advanced Research Projects Agency announced last December it was looking for ideas on how to update the military’s wireless communications platform to deliver 100Gbps connections.


Direct Link:  http://news.cnet.com/8301-1035_3-57591445-94/defense-department-building-its-own-secure-4g-network/

 

Dec 31, 2012
 

Here’s How Darpa’s Robot Ship Will Hunt Silent Subs

WIRED / Danger Room
By Spencer Ackerman
December 27, 2012

 

DARPA’s former headquarters in the Virginia Square neighborhood of Arlington. The agency recently moved to 675 North Randolph Street, near the Ballston Common Mall.

 

Submariners like to say there are two kinds of ships: subs and targets. The Pentagon’s futurists want to turn that aphorism on its head, and develop a new kind of surface ship that can turn a sub into a target. Naturally, the sub-hunter won’t have a human on board. Here’s how it’s going to work.

The video above is a new promotional piece of machinima (do people still say that?) released by the defense contractor Science Applications International Corporation, which has a $58 million contract with Darpa to build its unmanned sub-hunter of the future. That maritime robot, called the Anti-Submarine Warfare Continuous Trail Unmanned Vehicle, or ACTUV, doesn’t exist yet and won’t for years. But here SAIC at least sketches out how the long, thin and “radically different” ACTUV can keep surface ships from becoming targets.

 

The really interesting thing here is how different the surface-gliding ACTUV is from the now-familiar drones that litter the skies. Even the longest-flying drones can only stay in the air for 30 hours or so. SAIC intends for this thing to stay on a hunt for 60 to 90 days.

What’s more, SAIC is designing the ACTUV to be way more autonomous than contemporary drone aircraft: After a sailor powers it up and helps guide it out of port, she can go on a long vacation while the ACTUV speeds out to the open water to use its long-range acquisition sonar and other advanced sensors to scan for submarines, while automatically steering clear of any nearby surface ships.

Assuming SAIC isn’t over-promising (much), the sonar pods underneath the belly of the ACTUV create an acoustic image of a submarine and pursue it at high speed — although that’s something that can only happen when the ACTUV gets fairly close to its quarry. (More on that in a second.) Once the ACTUV thinks it’s got something, it pings nearby Navy ships through a satellite link. If a sailor thinks the ACTUV has made a mistake, he can convey that back to the unmanned ship and it’ll move on.
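
The hand-off described here amounts to a simple loop: build an acoustic contact, report it to the fleet over a satellite link, and drop the trail if a sailor vetoes it. The Python sketch below illustrates that loop; the class and function names are hypothetical illustrations, not SAIC’s actual software.

```python
# A minimal sketch (not SAIC's software) of the contact-report loop described above:
# the unmanned ship builds a sonar contact, reports it over a satellite link, and
# drops the trail if the human operator flags it as a mistake.

from dataclasses import dataclass

@dataclass
class SonarContact:
    contact_id: int
    bearing_deg: float   # bearing from the ACTUV to the suspected submarine
    confidence: float    # 0.0-1.0 estimate that the contact really is a sub

def report_via_satellite(contact: SonarContact) -> bool:
    """Stand-in for the satellite link to nearby Navy ships.

    Returns True if the human operator confirms the contact, False if the
    operator flags it as a mistake (in which case the ACTUV moves on).
    """
    print(f"SATCOM: contact {contact.contact_id}, bearing {contact.bearing_deg:.1f} deg, "
          f"confidence {contact.confidence:.2f}")
    return contact.confidence >= 0.8   # placeholder for the sailor's judgment

def patrol(contacts: list[SonarContact]) -> None:
    for contact in contacts:
        if report_via_satellite(contact):
            print(f"Contact {contact.contact_id} confirmed; maintaining the trail")
        else:
            print(f"Contact {contact.contact_id} vetoed; resuming the search")

if __name__ == "__main__":
    patrol([SonarContact(1, 212.5, 0.91), SonarContact(2, 47.0, 0.40)])
```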

If not, the ACTUV operates alongside the fleet, with coordination not often seen with aerial drone tactics. SAIC apparently wants the ACTUV in constant communication with a mothership and Naval aircraft that would fly overhead and drop sonar charges to hunt the mystery sub, with the ACTUV speeding along to keep pace with the swift submarine. SAIC seems to intend for the ACTUV to follow the sub back to its home port (!) if necessary, or until a human in the fleet commands it to break contact. The ACTUV, in case you were wondering, isn’t armed.

How all this will happen isn’t yet clear. The subs that really give the U.S. Navy pause are cheap diesel-electric models, which are technologically puny compared to the Navy’s nuclear-powered subs but can be much quieter and harder to track. Russia sells them; Iran claims to have them. SAIC’s video suggests that the ACTUV can’t actually find the diesel-electric sub on its own: The scenario here depends on a Navy commander suspecting there’s an enemy sub in the area and deploying a P-8 Poseidon surveillance aircraft (successor to the P-3C Orion) to drop sonar buoys to find it. The ACTUV sprints out in a certain pattern while “predict[ing] that long-range sensors will be able to completely envelop” the area where the sub might be “and prevent successful evasion.” So not an exact science, but its sonars are said to get more precise the closer the ACTUV gets to the suspected sub target.

The on-board hardware described generically in the video relies on “collected data and sophisticated logics” to “infer the intent” of watercraft. So that should at least make the ACTUV cognizant of any sizable metal thing that seems to be tracking a Navy ship. And if SAIC is right that the ACTUV can really hear the diesel-electric subs, then that enemy sub really may become the ocean’s newest target.

Direct Link:   http://www.wired.com/dangerroom/2012/12/actuv/

Nov 25, 2012
 

Court OKs warrantless use of hidden surveillance cameras

In latest case to test how technological developments alter Americans’ privacy, federal court sides with Justice Department on police use of concealed surveillance cameras on private property.

 

CNET News
by Declan McCullagh
October 30, 2012

 

Warrantless Government Surveillance

 

Police are allowed in some circumstances to install hidden surveillance cameras on private property without obtaining a search warrant, a federal judge said yesterday.

CNET has learned that U.S. District Judge William Griesbach ruled that it was reasonable for Drug Enforcement Administration agents to enter rural property without permission — and without a warrant — to install multiple “covert digital surveillance cameras” in hopes of uncovering evidence that 30 to 40 marijuana plants were being grown.

This is the latest case to highlight how advances in technology are causing the legal system to rethink how Americans’ privacy rights are protected by law. In January, the Supreme Court rejected warrantless GPS tracking after previously rejecting warrantless thermal imaging, but it has not yet ruled on warrantless cell phone tracking or warrantless use of surveillance cameras placed on private property without permission.

Yesterday Griesbach adopted a recommendation by U.S. Magistrate Judge William Callahan dated October 9. That recommendation said that the DEA’s warrantless surveillance did not violate the Fourth Amendment, which prohibits unreasonable searches and requires that warrants describe the place that’s being searched.

“The Supreme Court has upheld the use of technology as a substitute for ordinary police surveillance,” Callahan wrote.

Two defendants in the case, Manuel Mendoza and Marco Magana of Green Bay, Wis., have been charged with federal drug crimes after DEA agent Steven Curran claimed to have discovered more than 1,000 marijuana plants grown on the property, and face possible life imprisonment and fines of up to $10 million. Mendoza and Magana asked Callahan to throw out the video evidence on Fourth Amendment grounds, noting that “No Trespassing” signs were posted throughout the heavily wooded, 22-acre property owned by Magana and that it also had a locked gate.

 

U.S. Attorney James Santelle, who argued that the warrantless use of surveillance cameras on private property “does not violate the Fourth Amendment.” (Credit: U.S. Department of Justice)

Callahan based his reasoning on a 1984 Supreme Court case called Oliver v. United States, in which a majority of the justices said that “open fields” could be searched without warrants because they’re not covered by the Fourth Amendment. What lawyers call “curtilage,” on the other hand, meaning the land immediately surrounding a residence, still has greater privacy protections.

“Placing a video camera in a location that allows law enforcement to record activities outside of a home and beyond protected curtilage does not violate the Fourth Amendment,” Justice Department prosecutors James Santelle and William Lipscomb told Callahan.

As digital sensors become cheaper and wireless connections become more powerful, the Justice Department’s argument would allow police to install cameras on private property without court oversight — subject only to budgetary limits and political pressure.

About four days after the DEA’s warrantless installation of surveillance cameras, a magistrate judge did subsequently grant a warrant. But attorneys for Mendoza and Magana noticed that the surveillance took place before the warrant was granted.

“That one’s actions could be recorded on their own property, even if the property is not within the curtilage, is contrary to society’s concept of privacy,” wrote Brett Reetz, Magana’s attorney, in a legal filing last month. “The owner and his guest… had reason to believe that their activities on the property were not subject to video surveillance as it would constitute a violation of privacy.”

A jury trial has been scheduled for January 22.

 

Direct Link:  http://news.cnet.com/8301-13578_3-57542510-38/court-oks-warrantless-use-of-hidden-surveillance-cameras/

May 12, 2012
 

Army wants to monitor your computer activity

 

 U.S. Army Times
By Joe Gould – Staff writer
May 5, 2012

In the wake of the biggest dump of classified information in the history of the Army, the brass is searching for ways to watch what every soldier is doing on his or her Army computer.

The Army wants to look at keystrokes, downloads and Web searches on computers that soldiers use.

Maj. Gen. Steven Smith, chief of the Army Cyber Directorate, said the software was one of his chief priorities, joking that it would take the place of a lower-tech solution: “A guy with a large bat behind every user as they go to search the Internet.”

“Now we’ve been in the news — I don’t know if you’ve seen it — with a little insider threat issue,” Smith continued.

Smith did not mention Pfc. Bradley Manning by name. However, the effort comes in the wake of the former intelligence analyst’s alleged leak of hundreds of thousands of pages of classified documents to the anti-secrecy organization WikiLeaks in 2009 and 2010. Manning faces a military trial on 22 counts, including aiding the enemy.

According to Smith, the Army will soon shop for software pre-programmed to detect a user’s abnormal behavior and record it, catching malicious insiders in the act. Though it is unclear how broadly the Army plans to adopt the program, the Army has more than 900,000 users on its computers.

Smith explained how it might work.

“So I’m on the South American desk, doing intelligence work and all of a sudden I start going around to China, let’s say,” Smith said. “That might be an anomaly, it might be justified, but I would sure like to know that and let someone make a decision, almost at the speed of thought.”
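
Smith’s example boils down to comparing what a user touches against what that user’s desk is supposed to touch. A minimal Python sketch of the idea, using a hypothetical user-to-desk mapping and purely illustrative names, might look like this:

```python
# A minimal sketch of flagging accesses outside a user's assigned desk.
# The user-to-desk mapping and all names are hypothetical, for illustration only.

ASSIGNED_DESKS = {
    "analyst_a": {"south_america"},
}

def out_of_role_accesses(user: str, accessed_regions: list[str]) -> list[str]:
    """Return the regions a user touched that fall outside their assignment."""
    allowed = ASSIGNED_DESKS.get(user, set())
    return [region for region in accessed_regions if region not in allowed]

if __name__ == "__main__":
    anomalies = out_of_role_accesses(
        "analyst_a", ["south_america", "south_america", "china"])
    if anomalies:
        # Flag for a human decision rather than blocking outright, as Smith suggests.
        print(f"analyst_a accessed {sorted(set(anomalies))}; forwarding to a reviewer")
```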

The scenario echoes the allegations against Manning: As an intelligence analyst charged with researching the Shiite threat to Iraqi elections, Manning raided classified networks for State Department cables, Afghanistan and Iraq war logs and video from a helicopter attack, according to courtroom testimony.

Software of the type Smith describes is at various stages of development in the public and private sectors. Such software could spy on virtually any activity on a desktop depending on its programming, to detect when a soldier searches outside of his or her job description, downloads massive amounts of data from a shared hard drive or moves the data onto a removable drive.

The program could respond by recording the activity, alerting an administrator, shutting down the user’s access, or by feeding the person “dummy data” to watch what they do next, said Charles Beard, a cybersecurity executive with the defense firm SAIC’s intelligence, surveillance and reconnaissance group.

“It’s a giant game of cat and mouse with some of these actors,” Beard said.
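
The graduated responses Beard lists (record the activity, alert an administrator, shut down access, or feed “dummy data”) could be wired up as a simple severity-to-action mapping. The sketch below is an illustrative assumption about how such a policy might look, not a description of SAIC’s system:

```python
# A minimal sketch of mapping an assessed severity to one of the responses
# Beard mentions. The numeric scale and wording are illustrative assumptions.

def respond(severity: int, user: str) -> str:
    if severity >= 4:
        return f"serve dummy data to {user} and observe what they do next"
    if severity == 3:
        return f"suspend {user}'s access pending review"
    if severity == 2:
        return f"alert an administrator about {user}"
    return f"silently record {user}'s session for later analysis"

if __name__ == "__main__":
    for level in (1, 2, 3, 4):
        print(level, "->", respond(level, "analyst_a"))
```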

What’s exciting, Smith said, is the possibility of detecting problems as they happen, on what cybersecurity experts call “zero day,” as opposed to after the fact.

“We don’t want to be forensics experts. We want to catch it at the perimeter,” Smith said. “We want to catch this before it has a chance to be exploited.”

A governmentwide effort

The Army’s efforts dovetail with a broader federal government initiative. President Obama signed an executive order last October that established an Insider Threat Task Force to develop a governmentwide program to deter, detect and mitigate insider threats.

Among other responsibilities, it would create policies for safeguarding classified information and networks, and for auditing and monitoring users.

In January, the White House’s Office of Management and Budget issued a memo directing government agencies that deal with classified information to ensure they adhere to security rules enacted after the WikiLeaks debacle.

Beyond technical solutions, the document asks agencies to create their own “insider threat program” to monitor employees for “behavioral changes” suggesting they might leak sensitive information.

The interagency Insider Threat Task Force is aiming to complete work on the new standards by October. These standards may address training and employee awareness protocols, said John Swift III, senior policy adviser to a task force now working on the draft policy.

Deanna Caputo, lead behavioral psychologist for Mitre Corp., said both technical solutions and monitoring of human behaviors are needed for a successful detection and prevention program.

“To think that we can tackle the problem simply by technical solutions is a mistake,” Caputo said.

A “culture of reporting” is essential, she said. “We need to up the ante and expect a little bit more from our people” to report abnormal behaviors among their co-workers. However, “there is a fine line with that [reporting]. People need to trust they are in a safe environment to do their job.”

Carnegie Mellon’s Software Engineering Institute has compiled 700 insider threat case studies, and come up with two broad profiles of insiders who steal intellectual property in business settings.

One is an “entitled independent” disgruntled with his job who typically exfiltrates his work a month before leaving. The other is an “ambitious leader” who steals information on entire systems and product lines, sometimes to take to a foreign country, such as China.

According to Patrick Reidy, who leads the FBI’s insider threat program, such users may be conducting authorized activities for malicious ends, and their actions would not register on intrusion detection or anti-virus systems.

“People look at computers and networks but not people and data,” he said. “The insider threat is all about people.”

Reidy, Swift and Caputo discussed the effort at a defense industry convention in Washington, D.C., on April 4.

The ‘Pre-Crime’ division

Private industry and the Defense Advanced Research Projects Agency are among the entities that have technological solutions in various stages of progress.

Raytheon’s SureView software captures any security breach or policy violation it’s programmed to find and can “replay the event like a DVR” for a local administrator or others to view, according to the company’s website. The software’s trigger is programmable and can be set to any behavior, whether or not it is conventionally considered suspicious.
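
The “DVR” comparison suggests a rolling buffer of desktop events that is dumped for replay whenever a programmable trigger fires. The sketch below illustrates that general pattern with an arbitrary example trigger; it is an assumption about the design, not Raytheon’s actual SureView code:

```python
# A minimal sketch of a "DVR-style" recorder: keep a rolling buffer of desktop
# events and hand the whole buffer to an administrator when a programmable
# trigger fires. Event fields and the example policy are illustrative assumptions.

from collections import deque

class EventRecorder:
    def __init__(self, trigger, buffer_size: int = 1000):
        self.trigger = trigger                   # programmable predicate on events
        self.buffer = deque(maxlen=buffer_size)  # rolling "DVR" window

    def observe(self, event: dict):
        self.buffer.append(event)
        if self.trigger(event):
            return list(self.buffer)             # replayable capture for review
        return None

if __name__ == "__main__":
    # Example trigger: any copy to removable media larger than 100 MB.
    recorder = EventRecorder(lambda e: e["type"] == "usb_copy" and e["mb"] > 100)
    recorder.observe({"type": "web_search", "mb": 0})
    capture = recorder.observe({"type": "usb_copy", "mb": 512})
    if capture:
        print(f"Trigger fired; {len(capture)} buffered events available for replay")
```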

Working with Raytheon, a group of cadets from the U.S. Military Academy at West Point last year conducted a simulation of an insider attack at a forward operating base. Cadets looked at how to fine-tune the way SureView detects potential threats and eliminate false positives for innocuous behavior, said West Point computer science professor Col. Greg Conti.

“It was very powerful, very flexible and allowed you to monitor with very fine resolution activities on the desktop, and the real trick becomes how you detect anomalous behavior,” Conti said. “Predictive models are kind of the holy grail. When you see that no one else has done something but bad guys, you can start being predictive.”

At SAIC, which is testing a behavior analytics system, Beard likened behavioral modeling to the Pre-Crime unit from the science fiction movie “Minority Report.” Instead of using psychics to stop crimes before they occur, the software would be programmed to detect behavior that has preceded malicious acts in the past.

In real life, researchers are examining the behavior of malicious insiders to see what actions they took before they acted out. That in turn would be used to teach the software what behavior to flag.

“We may want to administer policies that say, ‘Gee, gosh, why do you really want to download 300 [megabytes] of stuff or a gig of data in a single session?’ ” Beard said. “We look for the antecedents of behavior that would suggest based on past history that bad things are going to take place.”

That could be visiting restricted websites, requesting access to information outside of one’s job description or asking for large amounts of storage media — or likely some combination of the above. Individually, the actions may not seem problematic, but combined and in the context of human intelligence, they could raise alarms.

“We start taking those things and recombining them to say, ‘What is going on in the environment?’ ” Beard said. “Any one of those things independently can be totally innocuous and innocent, but when you put them together — plus their job, plus their access, plus the things they are working on — you may be looking at it as a counterintel kind of thing.”
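
One simple way to “recombine” individually innocuous signals is to weight and sum them against a review threshold, as in the sketch below. The signal names, weights, and threshold are illustrative assumptions, not any vendor’s model:

```python
# A minimal sketch of weighting and summing individually innocuous signals
# against a review threshold. All names and values are illustrative assumptions.

SIGNAL_WEIGHTS = {
    "restricted_site_visit": 1.0,
    "out_of_role_access_request": 2.0,
    "bulk_storage_request": 1.5,
}
REVIEW_THRESHOLD = 4.0

def combined_score(signals: list[str]) -> float:
    return sum(SIGNAL_WEIGHTS.get(signal, 0.0) for signal in signals)

if __name__ == "__main__":
    observed = ["restricted_site_visit", "out_of_role_access_request",
                "bulk_storage_request"]
    score = combined_score(observed)
    # No single signal crosses the threshold on its own; together they do.
    if score >= REVIEW_THRESHOLD:
        print(f"score={score:.1f}: refer for counterintelligence review")
    else:
        print(f"score={score:.1f}: no action")
```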

Drawbacks and challenges

Cybersecurity expert Michael Tanji, an Army veteran who has spent nearly 20 years in the U.S. intelligence community, said he sees potential drawbacks and unanswered policy questions. He asked how the Army would implement such technology without unintentionally stifling cross-disciplinary collaboration among soldiers.

Knowing they are being monitored, personnel might avoid enterprising or creative behavior for fear it would be flagged by monitoring software, he said.

Tanji also predicted the technology would come at a considerable financial cost, both to warehouse the data collected by the software and to pay the added staff needed to monitor the reports it generates.

“A brigade-sized element that uses computers on a regular basis would probably need a company-sized element just to keep up with the data that comes in,” he said.

Reidy, the FBI official, said such concerns were valid. Because software may report benign behavior as malicious and vice versa, he cautioned against using technical solutions alone to solve insider threats.

“After a major incident, and no offense to any vendors, but the charlatanism always goes up,” he said. “It’s absolutely amazing how many phone calls I get from people who say they have solved the WikiLeaks problem or solved this or that problem. Everybody’s got to eat, but it’s simply not true.”

Finding bad behavior amid the vast sea of keystrokes, downloads and Web browsing on military computers is no easy task, DARPA acknowledges.

A DARPA solicitation for Suspected Malicious Insider Threat Elimination, or SMITE, announces it is attempting to recognize “moving targets” — telltale patterns of behavior amid “enormous amounts of noise (observational data of no immediate relevance).”

The program, based in behavioral science, would have to distinguish anomalous behavior from normal behavior, and deceptive and malicious behavior from anomalous behavior, the solicitation reads.

A solicitation for another program — Anomaly Detection at Multiple Scales, or ADAMS — uses accused Fort Hood shooter Maj. Nidal Hasan to frame the problem. It asks how to sift for anomalies through millions of data points — the emails and text messages on Fort Hood, for instance — using a unique algorithm, to rank threats and learn based on user feedback.
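
The rank-then-learn loop the solicitation describes can be sketched with a toy scoring model: score events, rank them for an analyst, and adjust a weight when the analyst marks the top lead as a false positive. The heuristic below is an assumption made for illustration, not DARPA’s algorithm:

```python
# A minimal sketch of the rank-then-learn loop described in the solicitation:
# score events with a toy heuristic, rank them for an analyst, and damp a weight
# when the analyst marks the top lead as a false positive.

def rank_anomalies(events, volume_weight):
    # Toy heuristic: larger transfers and odd-hour activity look more anomalous.
    scored = [(volume_weight * e["volume_mb"] + 10 * e["odd_hour"], e) for e in events]
    return sorted(scored, key=lambda pair: pair[0], reverse=True)

if __name__ == "__main__":
    volume_weight = 0.1
    events = [
        {"id": "msg-1", "volume_mb": 300, "odd_hour": 0},
        {"id": "msg-2", "volume_mb": 5, "odd_hour": 1},
    ]
    top_score, top_event = rank_anomalies(events, volume_weight)[0]
    print("Top lead:", top_event["id"], "score", round(top_score, 1))
    # Analyst feedback: if the top lead turns out benign, reduce the volume weight.
    analyst_marked_false_positive = True
    if analyst_marked_false_positive:
        volume_weight *= 0.5
        print("Feedback applied; new volume weight:", volume_weight)
```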

The program is trying to look beyond computers to spot the point when a good soldier turns, whether that means homicidal or suicidal or ready to dump stolen data.

“When we look through the evidence after the fact, we often find a trail — sometimes even an ‘obvious’ one,” the solicitation states. “The question is, can we pick up the trail before the fact, giving us time to intervene and prevent an incident? Why is that so hard?”

 

Direct Link:  http://www.armytimes.com/news/2012/05/army-wants-to-monitor-your-computer-050512w/