There is a gaping hole in the defences that companies use to keep out cyber thieves.
The hole is a global shortage of qualified personnel who analyse security hardware for signs of threats and fend off intruders.
Currently, the global security industry lacks about one million skilled workers, suggests a study by ISC2, the industry body for security professionals. The shortfall is likely to grow to 1.8 million within five years, it said.
The deficit is widely recognised and gives rise to other problems, says Ian Glover, head of Crest, the UK body that certifies the skills of ethical hackers.
“The deficit leads to increased costs,” he says. “Undoubtedly there is some impact, because companies are trying to buy a scarce resource.
“And this may mean that companies do not get the right people, because they are desperately trying to find someone to fill the role.”
While many countries have taken steps to attract people into the security industry, Mr. Glover warns that these efforts will be insufficient to close the gap.
Help must come from another source: machines.
“If you look at the increasing automation of attack tools, you must have matching growth in the automation of the tools we use to protect ourselves,” he says.
‘Drowning’ in data
That move towards automation is already under way, says Peter Woollacott, founder and chief executive of Sydney-based security firm Huntsman, adding that the change is long overdue.
For too long, security has been a “handcrafted” exercise, he says.
That is a problem when analysts are expected to protect companies that are “drowning” in data generated by firewalls, PCs, intrusion-detection systems and all the other devices they have bought and installed, he says.
Automation is nothing new, says Oliver Tavakoli, chief technology officer at network security firm Vectra; early uses of it helped antivirus software spot novel malware.
But now machine learning is allowing it to go much further.
“Machine learning is clearer and simpler than AI [artificial intelligence],” says Mr. Tavakoli, but that does not mean it can handle only simple problems.
The analytical power of machine learning arises from algorithms that can take in huge amounts of data and highlight anomalies or significant trends. Increases in computing power have also made it possible.
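As a rough illustration of the kind of anomaly-spotting described above, here is a minimal sketch (an assumption of my own for this article, not any vendor's production code): it flags values in a series that sit far outside the normal range.

```python
from statistics import mean, stdev

def find_anomalies(counts, threshold=2.0):
    """Return the indices of values more than `threshold`
    standard deviations away from the mean of the series."""
    mu = mean(counts)
    sigma = stdev(counts)
    if sigma == 0:
        return []  # a flat series has no anomalies
    return [i for i, c in enumerate(counts)
            if abs(c - mu) / sigma > threshold]

# Hourly event counts with one obvious spike at index 5.
hourly = [120, 115, 130, 118, 125, 900, 122, 119]
print(find_anomalies(hourly))  # → [5]
```

Real systems use far richer statistical models, but the principle is the same: learn what normal looks like, then surface whatever does not fit.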
These “deep learning” algorithms come in different flavors.
Some, such as OpenAI, are available to anyone, but most belong to the companies that developed them. As a result, large security firms are snapping up smaller, smarter start-ups in an effort to quickly strengthen their defences.
Simon McCalla, chief technology officer at Nominet, the domain name registry that oversees the UK’s .uk web domain, says machine learning has proved its usefulness via a tool it created called Turing.
It digs up evidence of web attacks from the huge number of queries the registry processes each day from people seeking the location of websites in the UK.
Mr. McCalla says Turing helped to analyse what happened during the January cyber-attack on Lloyds Bank, which left thousands of customers unable to access banking services.
The DDoS [distributed denial of service] attack generated a huge amount of data to process for that one event, he says.
“As a rule, we process about 50,000 queries every second. With Lloyds it was more than 10 times that.”
When the dust settled and the attack was over, Nominet’s team had to work through a whole day’s traffic in a few hours.
Turing has absorbed so much information from Nominet’s servers that it has learned how to give early warning of abuse, and intelligence on people setting up for a longer attack.
It logs the IP [Internet Protocol] addresses of hijacked machines that send queries to check whether email addresses are “live”.
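The scale of that Lloyds spike lends itself to a simple volumetric check. The sketch below is a hypothetical illustration, not Nominet’s actual Turing tool: it uses the article’s figures (a roughly 50,000 queries-per-second baseline and a tenfold surge) plus an assumed flagging rule to pick out source addresses that dominate traffic during a spike.

```python
from collections import Counter

# Baseline from the article: ~50,000 queries per second normally.
BASELINE_QPS = 50_000

def flag_hijacked_ips(query_log, window_seconds, spike_factor=10):
    """query_log is a list of source-IP strings seen in the window.
    If overall volume exceeds spike_factor times the baseline,
    return the heaviest senders (here: above 1% of total traffic,
    an assumed cut-off)."""
    qps = len(query_log) / window_seconds
    if qps < spike_factor * BASELINE_QPS:
        return set()  # traffic looks normal
    counts = Counter(query_log)
    cutoff = len(query_log) * 0.01
    return {ip for ip, n in counts.items() if n > cutoff}
```

A real registry would also correlate query patterns, domains and timing; volume alone, as Mr. McCalla notes, only catches the attacks that are “not that smart”.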
“Most of what we see is not that smart, actually,” he says, but adds that without machine learning it would be impossible for human analysts to determine what happened when the target, in this case the bank’s website, “went dark”.
Nominet’s Turing analysis is now helping the UK government to police its internal network. It helps to block access to dodgy domains and stops staff from becoming victims of malware.
Chaos and order
There are also more ambitious efforts to use the analytical skills of machine learning.
At the Def Con hacker convention last year, DARPA, the US military research agency, held a competition that let seven smart computer programs attack one another to see which was best at protecting itself.
The winner, called Mayhem, is now being adapted so that it can identify and fix flaws in code that could otherwise be exploited by hackers.
Machine learning can correlate information from many different sources to give analysts a rounded view of whether a series of events poses a threat or not, says Mr. Tavakoli.
It can learn the normal ebbs and flows of data in an organisation, and the different things employees get up to at different times of the day.
So when cyber-thieves do things such as probe network connections or try to tap databases, that anomalous behaviour raises a red flag.
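A toy version of that baseline-and-red-flag idea might look like the following. It is a deliberately simplified stand-in for the statistical models Mr. Tavakoli describes, assuming a profile of which actions each account performs at which hour; real systems score probabilities rather than doing set lookups.

```python
from collections import defaultdict

class BehaviourBaseline:
    """Learn which actions each account normally performs in each
    hour of the day, then flag anything outside that profile."""

    def __init__(self):
        self.seen = defaultdict(set)  # (user, hour) -> set of actions

    def observe(self, user, hour, action):
        """Record one action seen during normal operation."""
        self.seen[(user, hour)].add(action)

    def is_anomalous(self, user, hour, action):
        """True if this user has never done this action at this hour."""
        return action not in self.seen[(user, hour)]

baseline = BehaviourBaseline()
baseline.observe("alice", 9, "read_email")        # normal daytime work
print(baseline.is_anomalous("alice", 3, "query_database"))  # → True
```

The point of the simplification is the shape of the system, not the model: everything learned during normal operation defines “normal”, and the red flag is raised by deviation from it.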
But thieves have become very good at covering their tracks, and across a huge network the “indicators of compromise” can be very difficult for a person to pick out.
So now cybersecurity analysts can sit back and let machine-learning systems crunch all the data and identify the features of serious attacks that really deserve attention.
“It’s like surgeons who just cut,” says Mr. Tavakoli. “They don’t prep the patient, they just come in and do it very well.”
Follow Technology of Business editor Matthew Wall on Twitter and Facebook