September 11, 2017

Cornell Researchers Highlight Ethical Lapses in Recent Cybersecurity Failures


The internet is everywhere.

From simple dial-up connections on bulky computers to watches, cameras, printers, refrigerators and televisions, the spread of internet access demonstrates the progress the computing industry has made. Connectivity is lauded for making our lives convenient and efficient.

However, the increasing frequency of malware attacks and data leaks suggests that advancements in cybersecurity are not keeping pace. As a testament to this fact, on Sept. 7 the credit reporting agency Equifax revealed that personal information belonging to 143 million U.S. residents, including Social Security and driver’s license numbers, had been compromised. The Sun sat down with Prof. Stephen Wicker, electrical and computer engineering, and Prof. Emin Gun Sirer, computer science, to discuss related threats, ethical questions and solutions for the future.

A rising threat comes from malware known as ransomware. This past May, ransomware attacks affected more than 10,000 organizations running Microsoft Corporation’s Windows operating system in over 150 countries. The malware responsible, WannaCry, was built on an exploit reportedly stolen from the U.S. National Security Agency and leaked in April.

Individuals, government agencies, academic institutions and businesses have all fallen victim to ransomware over the past decade. Such malware encrypts files on a computer and threatens to destroy them if a ransom, paid in bitcoin, is not received within a certain period of time. Attacks have particularly affected hospitals, where doctors and nurses have lost access to patient records, putting lives at risk. In 2016, Hollywood Presbyterian Medical Center in California paid $17,000 in bitcoin after being offline for a week due to a ransomware attack.
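The underlying mechanism is simple enough to sketch. The illustrative Python snippet below is not drawn from any real malware; it merely uses the open-source cryptography package to show why files encrypted with a key only the attacker holds cannot be recovered, and why victims so often end up paying.

```python
# Conceptual illustration only: symmetric encryption with a key the victim never sees.
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet, InvalidToken

attacker_key = Fernet.generate_key()   # held only by the attacker
cipher = Fernet(attacker_key)

record = b"patient record: ..."
locked = cipher.encrypt(record)        # what remains on the victim's disk

# Trying a different key fails outright; brute force is computationally hopeless.
try:
    Fernet(Fernet.generate_key()).decrypt(locked)
except InvalidToken:
    print("cannot recover the data without the original key")

# Paying the ransom amounts to buying back this single call:
print(cipher.decrypt(locked))          # b'patient record: ...'
```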

WannaCry originated from software the NSA used for data collection and surveillance that exploited vulnerabilities in Windows. Only after the software was stolen and used to carry out attacks did the NSA inform Microsoft of the vulnerabilities.

Incidents like these raise important questions about the origins of malware and the ethical responsibilities of its creators.

Wicker’s research is primarily focused on information systems and networks, with a particular emphasis on ethics and the law. He said that people misunderstand the NSA lapse as a legal issue when it is actually an ethical issue.

“Do the individuals and organizations involved have an ethical obligation?” Wicker said. “I think so.”

Wicker acknowledged that continued security surveillance is important, to prevent terror attacks, for example, but said the tradeoffs need to be properly considered.

“There are other ways to do police work, in my opinion,” Wicker said. He said he believes the NSA should have informed Microsoft of vulnerabilities in their software earlier.

“The government’s obligation to build a secure computing infrastructure overrides the intelligence community’s desire to collect data,” Gun Sirer said.

Similar questions arise for those responsible for keeping these devices secure.

After becoming aware of the vulnerabilities, Microsoft issued a patch to fix them, but not all users installed it. As a result, many computers remained insecure, an instance of what is known as the “free-rider” problem.

“This ‘free-rider’ problem — some manufacturers and users choosing to enjoy the benefits of the internet without taking the time and effort to maintain secure computing systems — is unethical, and is a problem that will get much worse as the internet of things continues to grow,” Wicker said.

Unlike computers running Windows, many internet-connected devices do not have dedicated engineering teams issuing security patches, leaving them vulnerable to hacks.

Gun Sirer believes that vendors should be responsible for maintaining the security of computers and devices connected to the internet of things. Furthermore, he feels that it is much more important to fix vulnerabilities than to keep them secret.

Finally, important questions have been raised about the regulation of cryptocurrencies like bitcoin, which seem to be the preferred mode of ransom payment.

“Since organizations behind ransomware are large and underground, many of them go through the route of encrypting and holding files for ransom but many also outright steal bitcoins,” Gun Sirer said.

Ransomware attacks are difficult to deter because victims are very likely to pay, especially hospitals that need immediate access to patient information. Additionally, without a central regulator monitoring the movement of coins, accurately tracking payments is nearly impossible. Finally, because the coins bypass traditional banking systems, they are easily transferred between countries, allowing such attacks to spread.

Consequently, Gun Sirer’s research focuses on regulating and securing cryptocurrencies such as bitcoin. His team developed vault technology, which enables people to override thefts and reclaim stolen tokens. Gun Sirer and his team have also helped different Cornell entities, including the University Treasurer, develop disaster preparedness plans for such attacks.
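The interview did not go into how such recovery works, but the general idea behind vault-style schemes can be sketched as a time-delayed withdrawal that a separate, offline recovery key can cancel. The Python toy below is a hypothetical illustration of that concept; the class, its parameters and the 24-hour delay are illustrative assumptions, not Gun Sirer’s actual design.

```python
import time

class ToyVault:
    """Hypothetical sketch: withdrawals finalize only after a delay, during which
    a separate recovery key can cancel (claw back) anything still pending."""

    def __init__(self, balance, recovery_key, delay_seconds=24 * 3600):
        self.balance = balance
        self.recovery_key = recovery_key
        self.delay = delay_seconds
        self.pending = []  # list of (amount, release_time) pairs

    def request_withdrawal(self, amount):
        # A thief with the everyday spending key can only queue a withdrawal.
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.pending.append((amount, time.time() + self.delay))

    def finalize(self):
        # Funds leave the vault only once the waiting period has elapsed.
        now = time.time()
        remaining = []
        for amount, release_time in self.pending:
            if now >= release_time:
                self.balance -= amount
            else:
                remaining.append((amount, release_time))
        self.pending = remaining

    def recover(self, key):
        # The owner spots the theft during the delay and cancels everything pending.
        if key != self.recovery_key:
            raise PermissionError("wrong recovery key")
        self.pending.clear()

vault = ToyVault(balance=100, recovery_key="key-kept-offline")
vault.request_withdrawal(100)       # a thief queues a withdrawal of the entire balance
vault.recover("key-kept-offline")   # the owner claws it back before the delay expires
vault.finalize()
print(vault.balance)                # 100: nothing was lost
```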

While the immediate consequences of the attacks over the past few months have been severe, they have opened up debates in both the intelligence and computing communities on ethical questions in cybersecurity. Wicker and Gun Sirer specialize in different areas but agree on some common principles: that surveillance, while necessary, cannot override the need to secure the data of average citizens, and that those responsible for building computer infrastructure need to keep it secure.

  • Michael DeKort

    Lockheed Whistleblower – Equifax & most hacks root cause = Organizations literally avoiding a critical best practice
    https://www.linkedin.com/pulse/lockheed-whistleblower-equifax-most-hacks-root-cause-michael-dekort/

Privileged Account Security – The massive hole in most organizations’ cybersecurity – Organizations are avoiding this key best practice on purpose!

    This is why OPM, Yahoo, Equifax and others were hacked. And why most organizations and companies have already been hacked and do not know it or will be.

    My name is Michael DeKort. I am a former systems engineer, engineering and program manager for Lockheed Martin. I worked in aircraft simulation, the Aegis Weapon System, NORAD and on C4ISR for the US Coast Guard and DHS. I also worked in Commercial IT. Including at CyberArk and in banking, healthcare and insurance. I also received the IEEE Barus Ethics Award for whistleblowing regarding the DHS/USCG Deepwater program post 9/11.
    https://www.thenation.com/article/will-botched-coast-guard-contract-come-back-bite-james-comey

    The overwhelming majority of companies and government organizations are avoiding the most critical cyber-security practice of all. Dealing with privileged account security. It’s the biggest dirty secret in cybersecurity. Which is extremely unfortunate because virtually every hack on record was accomplished by someone gaining access to a privileged account then moving through the system. This usually occurs due to a successful phishing expedition. (Of which 22% are successful. Keep in mind only one is needed). Also most hackers are in a system for almost 6 months before being detected.

    Of the small fraction of companies that even deal with this area and purchase products few of them actually use the products they purchase properly. Many install them then slow roll actually using them to any significant degree for decades. Often this is meant to purposefully deceive C-Suite and regulators. This puts everyone at risk.
    Here is how bad things are. CMU CERT is the premier authority on cyber-security best practices. Especially for DoD. I found out that CMU CERT has no solution for themselves in this area. They actually defer to CMU IT for their own security and they have no solution in this area. Shouldn’t the organization responsible for telling others what best practice is use best practices for its own security?

    Why is this happening? IT leaders have no problem with firewalls, anti-virus or monitors of any systems except privileged accounts etc because those things are additive, don’t cause them to drive cultural habit changes or expose massive best practice issues. That leaves huge cybersecurity best practice gaps.

    Examples include having 4X more accounts than people, non-encrypted password files or spreadsheets, emails with passwords and software programs with passwords hard coded in them and many not knowing where they all are. As a result of this many passwords are not changed for decades. Especially for applications or databases. There is also the problem of having local admin permissions available on laptops and end points and not knowing where they all are either. Fixing those issues would also require forcing the masses to do things differently. Few have the desire to be part of any of that. In spite of “continuous process improvement” etc.
Governing bodies and regulators mean well but they don’t help much either. They try to avoid being too specific to let the industry figure out best practices, do what is right for them or avoid government being too involved. Most of it is nonsense. This gives organizations far too much room to wiggle. Which they have no problem exploiting. Most companies and organizations do the least amount possible.

    This is not a technical issue. It’s one of Courage. Courage to admit the problems exist and to deal with the culture and lead them to fix them. And to not sacrifice customers or the public to protect egos or let the bean counters justify it’s cheaper to harm customers than the bottom line.
