Interview with IT Manager Paul Lanzi

Paul Lanzi is the COO and co-founder of Remediant, an IT security startup that has created a product to protect privileged accounts. Prior to this startup, he worked for many years as an IT manager in the biotech field, managing various engineering teams at Genentech and then Roche.

Eleven years ago, when he started at Genentech, the first security problem he helped tackle was managing multiple accounts. “Everyone had multiple accounts and multiple passwords, and we built our own home-grown system to consolidate these accounts and make it easier for everyone to use a single username and password to get all of their work done. That actually improved security, since it lessened the chance that someone would have to write down their multiple passwords somewhere — but it also made it easier to ensure that every employee had the right access to do their job.”

Of course, today we have both single sign-on products to federate identities, such as Okta and Ping Identity, and identity governance products such as Sailpoint and RSA Archer. But back then this was hard work.

Lanzi’s best security tool has been multi-factor authentication. “I turn it on wherever I can, it is truly one of the most under-appreciated tools around. While it isn’t perfect, this technology sits in that rare sweet spot between simplicity and security,” he said. At his present firm he uses a combination of Google Authenticator and Yubikey Nano devices for this purpose. “I am amazed at how much crypto they can cram into that Nano form factor,” he said. The Nano is about the size of a thumbnail.
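Lanzi doesn’t go into implementation details, but the time-based one-time password (TOTP) scheme behind authenticator apps such as Google Authenticator is an open standard, RFC 6238, and a minimal sketch of the algorithm fits in a few lines of Python:

```python
import base64
import hmac
import struct
import time


def totp(secret_b32, for_time=None, digits=6, step=30):
    """Compute an RFC 6238 time-based one-time password from a base32 secret."""
    if for_time is None:
        for_time = int(time.time())
    counter = for_time // step          # number of 30-second steps since the epoch
    key = base64.b32decode(secret_b32.upper())
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = digest[-1] & 0x0F          # dynamic truncation, per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Because the server and the authenticator app share the same secret and roughly synchronized clocks, both compute the same six-digit code for each 30-second window — which is exactly the simplicity-plus-security sweet spot Lanzi describes.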

A decade or so ago, Lanzi was involved in rolling out 110,000 iPads globally at Genentech/Roche. “At the time, it was the largest non-education deployment of iPads in the world, and we used MobileIron’s MDM software to protect our data both at rest and in flight. Their MDM-based security capabilities gave us the ability to remotely wipe the fewer than 20 devices that were lost or misplaced each month. Its combined capabilities gave us assurance that when those devices were lost, the data on them was still secure. We could also enforce minimum OS version standards, to ensure that users were keeping them up to date with OS security updates.”
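The minimum-OS-version check Lanzi mentions is easy to picture. This is a hypothetical sketch, not MobileIron’s actual API: it compares dotted version strings numerically (so “10.0” correctly sorts above “9.3”) and flags devices that fall below the floor:

```python
def parse_version(v):
    """Turn a dotted version string like '9.3.2' into a comparable tuple (9, 3, 2)."""
    return tuple(int(part) for part in v.split("."))


def out_of_compliance(devices, minimum="9.3"):
    """Return the names of devices running an OS older than the required minimum.

    `devices` maps a device name to its reported OS version string.
    """
    floor = parse_version(minimum)
    return [name for name, ver in devices.items() if parse_version(ver) < floor]
```

Comparing tuples rather than raw strings matters: as strings, "10.0" < "9.3", which would wrongly flag up-to-date devices.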

Genentech/Roche had a very unusual security staff, composed of folks from different departments. “We had separate teams for patching desktops and maintaining our network infrastructure, an IT security policy writing group, an account provisioning engineering group, and an overall Security Architect as well. They contributed to an overall defense in depth because they were mutually supportive and worked together. That isn’t going to be possible in every enterprise, but we had terrific coverage across the various skills and potential threat areas. And given that we had personnel split across South San Francisco, Madrid and Basel, Switzerland, it was pretty impressive.”

How has security changed among his various employers over the years? “It really depends on the level of support at the executive level. At Genentech/Roche, we had executives who understood the risks and the investment needed to minimize the security risks. Other places were behind the curve and more focused on creating policies and lagged with their investment in security infrastructure. Part of the issue is that unlike in the retail or government sectors, biotech hasn’t had the big-news breaches to motivate organizations towards security improvements.”

Like what you are reading?

Subscribe to Inside Security!


The changing nature of IT security: Bryan Doerr, CEO at Observable Networks

Bryan Doerr has been involved with tech companies for decades, most recently leaving Savvis/CenturyLink, where he was CTO, before agreeing to help bootstrap Observable Networks. I asked him to reflect back on his career and where the infosec industry is headed in general. “There is a lot of security industry maturation still to come, a lot of wood left to chop,” he told me in a phone interview last week. “While there are still some pockets of maturity here and there, they usually are only found at the largest companies, who can afford it.”

Looking back more than a decade, the biggest change has been being able to deliver security as a subscription service, he said. “First we had pre-built security appliances, but lately we have seen managed detection and response services,” such as what his company delivers. “And it isn’t just a change in how protection is delivered, but how the subscription service can be more affordable for mid-market customers.”

Another big change is how end user customers finally are getting some benefit from sharing threat intelligence. “No one wanted to talk about where or how they were attacked and share these specifics with anyone else,” he said. This intelligence sharing has made the subscription service vendors more potent and compelling and has boosted the ability to respond effectively to threats.

“Ten years ago security was built on a simple idea: that we know about our attackers and threats, and through some means we could prevent those bad guys from getting inside our networks. Back then, we had a limited number of threats, so we could more readily recognize and block them. That is so far from where we are today. The fundamental nature of what is a threat and how attacks use technology has changed completely. The idea of tracking attack signatures makes a lot less sense when every attack is unique.”

Doerr agrees that the days of the perimeter being the sole point of defense are long over. As an example, he points to the recent IoT botnet attacks.

One benefit of the last decade has been the move toward increasing virtualization. “This absolutely was a positive influence, and helped us to better design and operate more secure systems and more complex infrastructure,” he said. “Before virtualization, we had too many different fiefdoms dedicated to particular circumstances. Each one had different configurations and staffs who were maintaining them. All of that variation left us vulnerable.”

But with virtual machines, “a lot of automation has been brought to bear to keep a consistent environment running. That means we can provision VMs, kill them off, and recreate them easily. This makes it more efficient to scale up and down and we don’t have to spend our time patching systems.”

Another issue is the nature of modern network traffic. “Our networks are becoming increasingly encrypted, we can’t even see what is going on over the wire and view the payloads, and this adds another layer of difficulty. Right now less than half of all traffic is encrypted, but it won’t be long before it becomes 100%. We won’t be able to readily examine any of this traffic, which will make networks harder to defend and detect exploits.”

When he was at Savvis, one memorable experience was upgrading one of their data centers. Thanks to a routing bug, the entire data center couldn’t come back online. “We tripped over it on a Saturday, and didn’t immediately understand what we were dealing with. It was easy to miss the single use case that caused the problem. That was a humbling experience and gave me an appreciation of the magnitude of the business that we had running. You don’t feel it until something terrible happens and you see how significant these outages are.” The situation drove home the point that he needed to stay in touch with his technology and understand that it is not just an abstraction but a very real entity.

I asked him who had the better job, the CTO or the CIO? He was firmly behind the CTO position. “CTOs will have jobs forever, because they help organizations understand the evolution of technology and anticipate the direction of that evolution. The CIOs still have some soul searching to do.”


The view from a former state agency CIO: David O’Berry

David O’Berry is a former CIO at a state agency with 1,000 employees; he now works for a security vendor. To give you an idea of his credentials, he holds the CISSP-ISSAP, ISSMP, CCSP, CRISC, CSSLP, MCNE, CSPM and CRMP certifications!

He met his wife in college when a virus erased his senior thesis text and backups: luckily, she was both a fast typist and a good sport. “That was by far the most expensive virus of my entire career!” Later on he had to fight another floppy-based virus, which was difficult because he had to run around the office finding infected disks and literally destroying them. He also faced down the Nachi/Welchia worm, which infected a PC that had not been patched because its user was out on maternity leave.

“When I was a CIO, imaging software probably saved us the most time and had the strongest impact initially along with mail filtering products and endpoint management tools for remote control. Besides these products, I believe that standardization of what we did and how we did it had the single largest impact on our organization being able to progress as rapidly as we did with as limited resources as we had.”

For fighting insider threats, “you have to have contextually aware DLP and scanning products, as well as what I call ‘Distributed Peer Review’ by the nodes that attach to the environment. Each node has to contribute to the survival of the organism by being a sensor in the larger scheme of things.” He has seen plenty of ransomware, and feels that “first and foremost it is a test of backup and recovery plans. Having a known state in that area fell out of vogue for a while, but now it is more important than ever, even if it seems like boring blocking and tackling.”

At his current employer, “we do use MDM and they also allow BYOD. As a former CIO, we had not adopted BYOD when I left but had made the entire workforce mobile and managed it accordingly. We also had implemented Imprivata for its single sign-on package.”

When it comes to securing the cloud and his cloud-based servers, “there are similar challenges to what we have been pursuing since the dawn of time. Visibility is king. Constructs that give you real-time visibility give you the edge over any other type of product when coupled with real-time mitigation and resilience.”

Now that he is on the vendor side, “I would say that the state of cybersecurity has gotten a lot worse since I made the jump, because the pace of innovation and change has gone vertical and never stops. Malware creators have become more and more adept at attacking the exploding number of devices. I believe we have a chance to get out in front of the next phase of this, but to do so we have to share information in real-time as well as allow companies to participate without artificial barriers to entry. However, our window of opportunity is closing rapidly.”


The view from a non-profit CIO

Being the CIO of a non-profit gives you an entirely different perspective in terms of managing people, resources, and technologies.

David Goodman would know. He has managed IT operations for different non-profits for most of his professional career. He was formerly the CIO of the International Rescue Committee, and currently is the CIO-in-Residence at NetHope, an umbrella organization that is a resource for some of the world’s largest non-profit aid organizations.

“The biggest challenge for non-profits with IT is that few people understand it in that context. We usually don’t have a roadmap or a sizable staff for implementing any new technology. Many organizations don’t have any dedicated infosec staff, or if they do, they have only one person for this task.”

Often, IT takes a hit from unplanned consequences that have more to do with where the non-profit is located than with the technology itself. For example, he tells the story of a nonprofit that opened an office in a very insecure country. “We opened an office there to help benefit refugees, which is our mission. We made connections with the local militia to make sure that we were permitted to do this and didn’t have any issues until one day our office was overrun by the militia and our people were taken hostage. They didn’t like what we were doing. While that doesn’t happen too often, it was pretty scary for our staff and volunteers. They took all of our computing equipment. Eventually, we were able to get them to release everyone, although two Americans were held in a hotel for a few extra weeks.”

Planning for this situation is a challenge, as you might expect. But the office had no incident response frameworks, no security policies. “There were passwords written on whiteboards. There were staffers using personal Skype accounts to communicate with headquarters. Because all the laptops were stolen, the rebels were using the staff’s personal Skype accounts that were set to autologin and were sending messages impersonating the staff. They couldn’t easily shut down these personal accounts.” Eventually all personnel returned safely and everyone was accounted for. But they lost all their equipment: “that was never seen again.”

Few IT managers or CIOs have to deal with this kind of situation. “It is pretty nasty stuff, and it is because of the nature of how many international nonprofits operate and the places they have their offices are often in conflict areas. This means we don’t just worry about IT security, but the safety of the staff too.”

Here is another example. At one international nonprofit, he wanted to improve the organization’s password policies. The issue was that many of the staffers are scattered around the world and don’t regularly login to their enterprise Active Directory domain controller which meant that staff didn’t get regular notifications of expiring passwords. “So for the field staff, we set their domain passwords not to expire. As you might imagine, this wasn’t great infosec policy, so I tried to implement a better one that had complexity and change management built-in. I got buy-in from senior management and approval from the CEO. We were ready to implement it, and I sent a reminder email to some of the affected parties, including the CEO.”

Then the CEO scuttled the whole idea: “He told me that he had been using the same password for more than 30 years and wasn’t about to change it now. So the very straightforward and approved password policy was shelved, and there are probably still hundreds of people using non-expiring passwords around the organization.” Goodman couldn’t get him to understand why a better password policy matters.
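The complexity rules in a policy like the one Goodman proposed are simple to express in code. This is a hypothetical sketch, not the actual Active Directory policy he drafted: it requires a minimum length plus at least three of four character classes, a common shape for such policies:

```python
import re


def meets_policy(password, min_length=12):
    """Check a candidate password against a simple complexity policy:
    minimum length, plus at least three of four character classes."""
    if len(password) < min_length:
        return False
    classes = [
        re.search(r"[a-z]", password),        # lowercase letters
        re.search(r"[A-Z]", password),        # uppercase letters
        re.search(r"[0-9]", password),        # digits
        re.search(r"[^A-Za-z0-9]", password), # symbols
    ]
    return sum(1 for c in classes if c) >= 3
```

Of course, as the anecdote shows, the hard part was never the code — it was getting a 30-year-old password retired.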

All is not gloom and doom however. At NetHope, he is working with a number of major donors, including the Gates Foundation and MasterCard International, to create non-profit specific security controls that can be used for guiding IT auditing and compliance. “We will have a set of best practices on how to appropriately secure critical data, all based on existing standards like ISO, NIST, and PCI. We will also provide implementation guidance so that nonprofits without dedicated info sec staff — which is nearly all of them — will know how to implement these controls.”


The view from a small college CIO: Infosec is getting harder to do.

Ravi Ravishanker is the CIO and Associate Provost at Wellesley College in Massachusetts. He has been in IT for many years and supports an organization with more than 1,400 faculty and staff. I spoke to him in September 2016. “Information security has continued to be one of the highest priorities for every one of the IT organizations I have worked for. The only difference is that it has become harder, and its relative importance compared to the other things we have to do has grown, which results in much higher resource allocation to security across the entire institution.”

He recalls 1986, when he began his IT career writing assembler code for a VAX running VMS, chosen to make the code execute faster. “However, we made a programming error in a program that let one user send a file to another using TCP/IP. Because of an internal security lapse, the students found out they could send someone else’s files using our program. It didn’t take long to fix the problem, fortunately.” Coming into the modern day, he finds that vulnerability scanners are one of his most important security tools. “This is because they expose vulnerabilities such as network ports that shouldn’t be open. Similarly, scanners that test our web apps for a range of vulnerabilities are also essential.”
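A vulnerability scanner’s most basic job, finding network ports that shouldn’t be open, can be sketched with nothing but the standard library. This is an illustrative toy, not a substitute for a real scanner (and you should only probe hosts you are authorized to scan):

```python
import socket


def scan_ports(host, ports, timeout=0.5):
    """Return the subset of `ports` on `host` that accept a TCP connection."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 on success instead of raising an exception
            if s.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports
```

A production scanner layers service fingerprinting and known-vulnerability checks on top of this connect test, but the core idea — enumerate what is listening, then ask whether it should be — is the same.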

“We realize that given our limited resources, we have to be very diligent. First and foremost, data and network security needs to be a priority for everyone in the IT organization, not just a select group of security administrators. Also, security is a joint partnership between IT and our users; it is a shared responsibility of the entire enterprise. If our users aren’t following best practices, they can expose our enterprise to data security issues. Security is a critical part of everything that we do.”

To date, he hasn’t seen much in the way of insider threats at the college. “People in higher education have a sense of loyalty to the institution, and we place a lot of trust in our employees. While insider threats are always a potential issue, we are in a space where it is minimal.”

The college has moved into the cloud and continues to increase its cloud footprint. “We try to do as much due diligence when we sign up with a new provider and make sure that they are giving us the security that we need. We thoroughly review the contracts and agreements from security and compliance perspectives before signing up with a provider.”

“We are a fairly small IT organization, and currently our user services group, which manages desktop support, and the systems and network groups are all under one director. This works really well in terms of information exchange between the groups and easy access to the systems and network engineers. However, we recently decided to reorganize this group, and we hope this relationship will be preserved, because it is critical from an information security perspective.”


Learning from the US Secret Service how to protect your enterprise

With all the changes to infosec technology, here is a not-so-outrageous idea: maybe you should take a page from the US Secret Service playbook in how you run your IT security department. Actually, this idea didn’t come from me, but from someone who actually is familiar with both roles. Nathaniel Gleicher is trained as a computer scientist and a lawyer, and currently is the Head of Cybersecurity Strategy at Illumio, a security vendor. Previously, he prosecuted cybercrime at the US DOJ and served as Director for Cybersecurity Policy at the White House National Security Council. While he worked at the White House, he saw multiple data breaches. “Every breach relies on lateral movement, and instead of attackers being at risk once they get inside, they’re able to take all the time that they need to identify high value information and cause damage.”

He thinks organizations need to take a different, simplified approach and go back to the basics: get visibility inside the data center and cloud and then be able to truly lock the doors inside.

In a blog post for his firm, he writes: “Like the Secret Service, cybersecurity defenders face a similar problem: they are defending high-value assets that must be protected, but also have to speak to hundreds or thousands of other servers. You have to have visibility, and reduce your attack surface, and focus on the security consequences for your most valuable assets. Shutting down the attack surface constrains attackers, makes lateral movement harder, forces attackers to risk exposure, and makes other security tools more effective.”

Sadly, most organizations focus their cybersecurity spend today at the perimeter, making no effort to secure or even understand the interior of their data centers. After reading Gleicher’s post, I asked him if there is a difference between interior and exterior networks any longer. He told me in a phone interview, “Everything is a potential threat. One difference is that you can have greater control around an interior network. And your network visibility is much more limited with exterior ones. But that’s missing the point. An intruder can find something once they are inside your network and can look around. Organizations are trying to layer defenses at the fortress wall, while the cyber attackers are parachuting inside and then free to move around as they want inside the data center and cloud.”

He continued, “I still have conversations with CISOs that don’t know how their devices are connected to their networks. And I don’t mean just a list of these devices, but how they are related to each other, both logically and operationally. This is the kind of information that attackers can exploit.”
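The device-relationship mapping Gleicher describes — knowing not just the list of devices but how they connect — can be modeled as a simple graph built from observed network flows. This is an illustrative sketch with made-up device names; a breadth-first walk then shows everything an attacker could pivot to from a compromised node:

```python
from collections import defaultdict


def build_adjacency(flows):
    """Build a device-to-device adjacency map from observed (src, dst) flows."""
    graph = defaultdict(set)
    for src, dst in flows:
        graph[src].add(dst)
        graph[dst].add(src)  # treat each observed flow as a bidirectional link
    return graph


def reachable_from(graph, start):
    """Breadth-first walk: every device an attacker could pivot to from `start`."""
    seen = {start}
    queue = [start]
    while queue:
        node = queue.pop(0)
        for neighbor in graph[node]:
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen - {start}
```

Segmentation, in this model, is the act of deleting edges: every link you remove shrinks the reachable set from any compromised starting point, which is exactly what makes lateral movement harder.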

His work with the Secret Service focused him on the lessons of providing physical security to protect the President. “People don’t see the Secret Service advance work that was done months before any presidential visit. They had to map the location and understand the physical space. The same is true for cybersecurity, because we need to identify the attacker quickly and respond fast too. This means that any cybersecurity effort should start months before any potential attacker actually shows up.” In other words, it isn’t just about stopping someone from getting across the White House fence, but understanding what will happen once they enter the grounds and what they might end up doing.

He agrees that good security isn’t easy, and he learned that early, in his first IT job for the Peace Corps. There he created a campus-wide network to connect 85 machines located in the different buildings of a college on a Caribbean island. Less than five minutes after it was first connected to the Internet, it was breached. It took him several tries to close various ports and other vulnerabilities before he could defend the network properly. “This was an early lesson on how hard it is to do security properly: there are way more people trying to get in than keeping them out. It also showed me that the steps to strengthen data security aren’t rocket science and are very straightforward. It is a lot more about how to orchestrate them and use them efficiently across the enterprise.”

Instead of focusing on the lack of response, he says we should be doing a better job of evaluating the highest-value targets, which is another lesson he learned from watching the Secret Service in action. He said, “You shouldn’t be in the business of protecting the app that handles your employee’s lunch request.” Nor should everything in the data center be treated equally. “There are some things in your data center that are more valuable, and you have to focus on what needs the most protection. If a burglar gets into your house and gets into your basement, that is different from him getting into your bedroom where you keep your jewelry.”
