We have a rich history of running webinars at Veracode, but it seems that recently we’ve been doing more in-house and via partner channels. Going forward, you’ll see a monthly update like this post detailing all of our anticipated online events, to make you aware of our webinars sooner and help you plan for attendance if you’re interested. Without further ado, here’s the slate of webinars for the remainder of May!
Breached! App Attacks, Application Protection and Incident Response
Brief: This webinar will first explain software application vulnerabilities and define their various types. It will also present recent research findings about the prevalence of these vulnerabilities and their impact. From there it will discuss what organizations can do to harden their applications. Finally, the webinar will cover best practices for responding to a successful application attack.
- Date: Thursday, May 23
- Time: 1:00pm – 2:00pm EDT
- Host: Co3Systems
- Veracode speaker: Chris Wysopal
- To register for the event visit: https://www4.gotomeeting.com/register/901209343
The Intractable Problem of Software Security
Brief: We all know that applications are inherently insecure, yet some of the highest profile breaches in 2012 were the result of easily remediated coding flaws. These flaws persist in almost all the software that runs most websites and businesses; SQL injection alone affects 32% of web applications. If the current state of software security is any indication, we’ll continue to hear about major data breaches in 2013 and beyond.
Join Chris Wysopal, Veracode’s Co-Founder and CTO, as he discusses the current and future state of appsec. He will dive into the data that drive the predictions detailed in Veracode’s fifth annual State of Software Security Report. This report pulls data from tens of thousands of live application scans performed on the Veracode Platform.
- Date: Friday, May 24
- Time: 1:00pm – 2:00pm EDT
- Host: SANS Institute
- Veracode speaker: Chris Wysopal
- To register for the event: https://www.sans.org/webcasts/intractable-problem-software-security-96655
Don’t Ask, Don’t Tell: The (In)Security of Vendor Software
Brief: What vulnerabilities threaten the integrity of your software supply chain and data? Can your enterprise really influence software vendors to meet your most important security policies and remediate insecure software?
Action is needed, and urgently. An alarming 62 percent of all applications fail to reach compliance on their first submission, according to a study recently conducted by Veracode, Enterprise Testing of the Software Supply Chain. While few enterprises currently have formal third-party testing programs, those that do find they dramatically improve vendor compliance while meeting industry standards.
- Date: Thursday, May 30
- Time: 9:00am – 10:00am EDT
- Host: BrightTALK
- Veracode Speaker: Chris Eng
- To register for the event visit: https://www.brighttalk.com/webcast/574/74823
I recently came across an interesting blog post by a team member at Acunetix that addressed a challenge many enterprises are facing when it comes to securing third-party components. This is a pretty hot topic in certain circles these days, and understandably so – studies have suggested that as many as 65% of an enterprise’s mission-critical applications are developed externally. Additionally, Veracode research shows that a typical internally developed application contains somewhere between 30% and 70% externally developed code, indicating that even internally developed apps utilize code originating outside their own walls.
Given these statistics, Mr. Beaver provides some great advice – involve the team members truly at risk in the risk-vs.-reward decision rather than leave the decision solely up to IT. However, the challenge of vendor risk management has grown significantly. In the past we’ve seen pre- and post-procurement assessments covering a variety of topics, including the financial security of software vendors, background checks of employees, physical checks of vendor environments and scanning of perimeter components such as firewalls. Surprisingly, it is only in the last several years that we have seen a rise in the number of enterprises making scanning of third-party software a part of the procurement process. At Veracode this effort is clear, as we’re on pace to analyze, educate and help improve the security posture of software at over 1,000 vendors in 2013.
As application security scanning of third-party applications becomes a standard part of the procurement process, we will see the focus move toward the root cause of issues: identifying code-level flaws in applications and driving vendors to fix them. In fact, we currently share our best practices and lessons learned with the community to help improve their vendor relationships and simplify the scanning process.
While our main focus is on helping large enterprises drive security improvements in their vendor community, we understand that a comprehensive Vendor Application Security Testing (VAST) program may not work for all vendors, or for organizations that don’t already embrace application security best practices in the supply chain.
For those teams looking to begin a vendor risk management program, I recommend the following:
- Clarify the Goal – A simple “scan and we’ll tell you what to fix” request can be frustrating for vendors who don’t have a clear goal. Define a policy (such as removal of all flaws in the OWASP Top 10), testing techniques (dynamic, static, manual pen testing, etc.) and reasonable timelines (think in terms of months, not days) in which vendors should fix their flaws.
- Understand Market Immaturity, But Drive Maturation – Some of the more mature software vendors have very impressive AppSec practices, including developer education, static analysis integrated into the SDLC, routine dynamic and penetration testing, and a variety of other activities, and they are typically very cooperative in fulfilling security requests. While these vendors are great to work with, they’re not the norm. Don’t punish those vendors who have not committed to building security in, but make it clear that they will be expected to do so in the very near future or face potential impacts on your business relationship.
- Disparate Responses are OK (for now) – It’s often the case that security is an afterthought for vendors – they’re typically worried more about the next release, new features and keeping the lights on than providing a secure product. The goal of vendor activities should be twofold:
- Gather information your organization needs to make better informed decisions.
- Drive better practices for vendors going forward.
- Tip: Providing vendors with the option to complete an initial questionnaire like the vBSIMM will get your team short-term self-attested answers and drive vendors toward adopting better long-term practices. Being realistic but prescriptive in your requests will help you not only get initial responses but also drive longer-term adoption of standard best practices.
- Work With Your Peers – Vendors are typically selling to dozens, if not hundreds, of customers. If they have to fulfill a separate security requirement for each of those, they’ll quickly become frustrated. Most industries have groups that are beginning to focus on AppSec best practices for vendors (such as FS-ISAC in the financial services community). Investigate these groups and begin discussing how you can work together to drive standard goals for vendors.
- Involve Procurement – Nearly every vendor deliverable must go through a procurement process that includes a variety of requirements. Work with this team to include AppSec requirements in the RFP process and to include contract language that requires participation in security analysis and remediation. We make our recommended language available for free at: http://www.veracode.com/services/build-security-criteria-into-contracts.html
Securing third-party applications is becoming an increasingly popular topic in the security community. Regardless of the type of solution, enterprises are realizing they don’t control or have any idea what the security posture is for many of the products they use to run their businesses. It could take several years to drive improved awareness and adoption of secure development practices across the broader vendor community, but if businesses begin by following the above recommendations, they are taking a huge first step in making sure the applications they use aren’t putting their business at risk.
Tomorrow, Veracode co-founder and CTO/CISO Chris Wysopal and Josh Corman, co-founder of Rugged Software and Director of Security Intelligence at Akamai Technologies, will be filming a video segment with Paul Roberts of The Security Ledger.
The trio will be chatting about a variety of topics trending in the AppSec field, including but not limited to: recent changes to the OWASP Top 10, security of third-party software components, and industry culture.
The two will also be answering a selection of questions submitted by members of our community. If there’s a question you’d like answered by either Chris or Josh, you can submit it in the comments on this post, tweet it to @Veracode, or use the hashtag #TalkingCode! Please submit your questions for consideration before 12pm EST on Friday, May 17th.
The video will make its debut on The Security Ledger in June; as always, we’ll let you know once it’s live.
Application security is the most pressing security problem facing IT security professionals, beating out malicious software and mobile devices, according to a large-scale survey of security practitioners released by (ISC)2 and Frost & Sullivan.
The 2013 (ISC)2 Global Information Security Workforce Study ranked application security issues at the top of the list of security concerns – the same place they occupied in a similar survey in 2011. Application vulnerabilities were listed as a “top” or “high” concern by 69 percent of survey respondents. That’s a slight dip from 2011, when 73 percent of respondents named them as their top security threat. Malware, including viruses and worms, moved up to the #2 spot, with 67 percent of respondents listing it as a “high” concern or their “top” concern.
Now, if you follow security for any amount of time, you know that there are all kinds of surveys. Vendors survey their customers. Publications survey their readers. Random web sites survey random collections of folks who visit their site. There’s a lot of variability and survey results should always be taken with a grain of salt. That said, the (ISC)2 survey has some weight to it. First of all, it was conducted with the help of professionals (Frost & Sullivan as well as Booz Allen Hamilton), not the Director of Marketing of SecurityStuff Inc. Second, the sample population is large: 12,000 information security professionals.
The survey data concerning application security is revealing. Those surveyed didn’t just say they worried about the threat posed by application vulnerabilities; they also acknowledged that much of the blame lay within their organizations. Forty-one percent of those surveyed listed “applications and system development security” as their second most urgent training need, after “information risk management” (the choice of 47% of those surveyed). As the report notes, “Many organizations have come to the realization that their own internally created software suffers from the same security risks as those coming from a vendor.”
When asked what aspects of software development held the most security concerns for them, respondents said it was early-stage development that concerned them the most – not QA. Eighty-one percent listed software “design” as the development task in need of better security, followed by “specifying requirements” and “testing, debugging or validation.”
That probably shouldn’t come as a surprise. IT security professionals are increasingly involved in software development. Fully 22% of those who took the (ISC)2 survey said that they were “personally involved in software development.” Of those respondents in the Americas region, the figure was 24%.
As this blog (and others) have noted on many occasions, more secure application development starts with better training for would-be application developers. That’s especially true as more and more applications make use of shared and open source components that speed development, but often at the price of security.
Chris Wysopal on Tuesday wrote about SAFECode, a program spearheaded by Adobe to offer free application security training to developers. With almost three quarters of applications submitted to Veracode failing to comply with enterprise security standards, any help is appreciated. The (ISC)2 survey is a reminder that, as the ranks of those involved in software development swell, so does the need for education about application security.
Our entire Research team is in town this week for a round-table catch-up, and this fun artist’s rendition of them materialized. Since I haven’t personally met them all, I was unable to identify a few of them by these cartoons. I figured I’d turn to our trusty community to help me out: comment below if you think you know an avatar’s human counterpart, with the number next to them and their full name.
Who is this tablet toting technomancer?
The first to correctly identify any Veracoder will be credited below by name or Twitter handle!
- Isaac Dawson (@_wirepair) by @c0ntr0llerface
- Chris Eng (@chriseng) by @hackerhuntress
- Brandon Creighton (@unsynchronized) by X30n
- Fred Owsley (@fredowsley) by @gwenkrauss
- Ryan O’Boyle (@523) by @c0ntr0llerface
- Bonus: Melissa Elliott (@0xabad1dea) by @hackerhuntress
A developer’s main goal usually doesn’t include creating flawless, intrusion-proof applications. In fact, the goal is usually to create a working program as quickly as possible. Programmers aren’t security experts, and perhaps they shouldn’t be. But with 70% of applications failing to comply with enterprise security standards (data from Veracode SoSS vol 5), it is clear more attention needs to be given to secure programming techniques.
This is why, when I came across an article describing a new training program by the Software Assurance Forum for Excellence in Code (SAFECode), I was pleasantly surprised. The organization, led by Howard Schmidt, will offer training courses for “anyone that does development work”. The first six training courses will focus on web application security flaws such as SQL injection and Cross-Site Scripting.
I haven’t had a chance to view the full curriculum, but I have confidence that the security pros at Adobe have put together an excellent program. Web application security flaws are some of the easiest flaws to avoid and among the most exploitable, yet they are also some of the most common, so I think starting the program with lessons on web applications is a great first step. It is an extra bonus that the material will be Creative Commons licensed, which should allow for wide distribution. The free on-demand training courses are available at:
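To illustrate the kind of flaw this training targets, here’s a minimal, hypothetical sketch of SQL injection and its standard fix, a parameterized query. The table, data, and function names are invented for the example:

```python
import sqlite3

def find_user_unsafe(conn, username):
    # Vulnerable: attacker-controlled input is spliced into the SQL text
    return conn.execute(
        "SELECT id FROM users WHERE name = '%s'" % username
    ).fetchall()

def find_user_safe(conn, username):
    # Parameterized: input is bound as data, never parsed as SQL
    return conn.execute(
        "SELECT id FROM users WHERE name = ?", (username,)
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice'), (2, 'bob')")

payload = "nobody' OR '1'='1"
print(len(find_user_unsafe(conn, payload)))  # 2: the injected OR matched every row
print(len(find_user_safe(conn, payload)))    # 0: the payload is treated as a literal name
```

The fix costs nothing at runtime, which is exactly why this class of flaw is “easy to avoid” yet keeps showing up: it has to be taught, not discovered.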
The security industry needs more programs like the training from SAFECode. When combined with integrating security testing and scanning into the software development lifecycle (SDLC), these programs will help create less vulnerable applications and reduce the number of successful attacks using well known vulnerabilities. While it seems like most people agree on these points, the need for speed has somehow made slowing down to consider security during the development process uncool. This is especially true when programmers don’t have as many resources at their disposal, for example, when developing open source applications. It is as if acknowledging that you may have security flaws in your code is the same thing as admitting you aren’t a true programmer. This couldn’t be farther from the truth. Even the smartest, most innovative programmers can create software with flaws because they are human and imperfect, just like the rest of us.
Offering free training courses and materials on secure coding will hopefully serve a dual purpose. My first hope is that it will help programmers use more secure coding practices. The second is that it will eliminate the taboo of admitting (during the development stage) that an application could have security vulnerabilities. Only then can flaws be remediated before the program is released.
Everyone has had that dreaded experience: you open up the task manager on your computer… and there’s a program name you don’t recognize. It gets worse when you google the name and can’t find a concrete answer on what it is and why it’s there. It gets even worse when you remove it from Autoruns and it comes back. It gets terrible when you realize it has keylogger functionality. The icing on the cake, however, is when the mystery program is also eating up all your RAM.
The RAM issue is actually how this special little program on my own computer came to my attention. I recently bought a high-end Windows 8 tablet – to protect the guilty, we’ll call the manufacturer “Spacer”. Like most Windows computers, it came with an assortment of apps preinstalled by “Spacer”, ranging from the mildly useful to trash you delete without hesitation. In particular, I liked the interface that popped up when I plugged into HDMI, so I didn’t go on a vendor utility murdering spree.
I happened to have Resource Monitor open, and I noticed that the second-most RAM-hungry program was… a “Spacer” background service with a generic name, consuming 280MB. Not bad for a 15KB binary! Googling the name, “MEMS Enhancement Utility”, only turned up other customers wondering what it was and observing that getting rid of it didn’t seem to break anything. I disabled it and rebooted, but it came back. Presumably, one of the “Spacer” apps was serving as a watchdog for the others. The easy solution is to simply get rid of the program altogether, but I decided to investigate what made this program so important in the first place.
Figure 1: Not the most clarifying metadata
It turns out that the program was written in .NET, which is vastly easier and faster to reverse-engineer than conventional native binaries. At Veracode, we have our own internal tools for automated analysis of .NET programs, but for interactive purposes, I recommend the free JetBrains dotPeek.
When starting an investigation of a binary, I like to take a quick tour of bundled functionality.
Figure 2: Imported Namespaces
Aside from the typical imports, Windows7.Sensors is a fairly self-explanatory name, and is in fact just a sample code kit off MSDN for reading the tablet’s accelerometer. That’s interesting but rather benign functionality. Far more… concerning are the member variables and methods of the “gma” namespace.
Figure 3: Consider my eyebrows raised
This is, of course, the classic sign of a userspace keylogger, but for every keylogger out there, there are a hundred legitimate apps that hook the keyboard and mouse for perfectly sensible reasons; otherwise, why would it even be in the standard Windows API? I was, however, beginning to question the provenance of this application.
The actual logic of the utility, however, was… puzzlingly brief. It initiated a nearly-empty form and hid it. It set up handlers to receive keyboard, mouse, and accelerometer activity. It then set up timers to poll the accelerometer based on how long it’d been since the last keyboard or mouse activity. And that’s it. That’s all. The application does not store or transmit or even display the information polled. It does nothing. I spent the better part of two hours scouring the obscure corners of the binary, thinking surely I must be missing some cleverly hidden method that actually uses this data. I couldn’t find one.
Putting aside this issue for now, I couldn’t help but think: why in the world is this little app ballooning into hundreds of megabytes of RAM? That’s usually the sign of a runaway memory leak, but in a pure .NET application, such things are actually difficult to cause, whereas in a C/C++ program they’re very difficult not to cause.
The answer lies in the fact that the sensor-reading DLL uses marshaling to interact with native APIs, and actually calls traditional memory allocation routines. Hence, every time the accelerometer is polled, manual allocations are made that may or may not ever be manually freed depending on control flow.
Figure 4: One of the places where the sensor glue code indulges in manual memory management
Violently shaking the tablet (is this thing still under warranty?) causes the RAM usage of “MEMS Enhancement Utility” to spike, but not all at once, as the accelerometer reader is going off at timed intervals rather than constantly. The memory usage will balloon by several megabytes every time I shake the tablet, and after a few minutes, some of it will be reclaimed by the garbage collector but some will not. Hence, the base RAM usage of the process steadily creeps upwards.
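The leak pattern described above can be sketched abstractly. This is a hypothetical model, not the actual “Spacer” code: every poll manually allocates a marshaling buffer for the native call, but an early-return path skips the matching free, so a fraction of polls leak and the process’s footprint creeps upward:

```python
# Simulated "native heap": handle -> buffer. In the real utility this is
# unmanaged memory allocated by marshaling glue, invisible to the .NET GC.
allocated = {}
next_handle = 0

def native_alloc(size):
    global next_handle
    next_handle += 1
    allocated[next_handle] = bytearray(size)
    return next_handle

def native_free(handle):
    allocated.pop(handle, None)

def poll_accelerometer(reading_ok):
    buf = native_alloc(4096)          # buffer for the native sensor call
    if not reading_ok:
        return None                   # early return: buffer is never freed
    native_free(buf)                  # only the happy path frees it
    return (0.0, 0.0, 9.8)

# Every 4th poll takes the leaky control-flow path.
for i in range(1000):
    poll_accelerometer(reading_ok=(i % 4 != 0))

print(len(allocated))  # 250 buffers leaked out of 1000 polls
```

A garbage collector can never reclaim these buffers because, from its point of view, they were never its responsibility; only a matching manual free on every path would fix it.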
So now we know what the program does and why its memory usage is so high, but that still leaves the question of why it’s doing this at all. The clues are there, vestigial remnants of removed code, exciting to any Executable Archaeologist:
Figure 5: Declared variables of the main form
The generically named “Form1” of the application contains several widgets which are never actually displayed: a start button, a stop button, a place for displaying mouse coordinates, and a text box for displaying some other unspecified data. I believe this was originally a debugging utility used by “Spacer” engineers to calibrate the accelerometer so that it would not go off when one simply tapped on the touchscreen (triggering a mouse event or keyboard event). They didn’t bother to rigorously prevent memory leaks because it was never intended to run for more than a few minutes at a time. Somehow, through some miscommunication, a copy of this program with the logic for rendering the visuals stripped ended up on the list of utilities that needed to be kept in the final version of “Spacer’s” Windows 8 image for this model of tablet. Someone then dutifully registered it to be launched in the background every time the tablet boots, and every time the tablet is tilted, shaken, or prodded a little too hard, the RAM usage goes up.
Never attribute to malware what can be adequately explained by a few lines of debugging code somebody forgot to disable.
Warnings about the security of medical devices often get passed off as just more “FUD.” But the case of serial killer Charles Cullen shows that arcane application security issues can literally be matters of life and death.
Cullen, you may recall, is the career nurse and former Navy electronics technician who admitted to a 16-year-long killing spree comprising 40 murders, all of hospital patients under his care, though experts familiar with the case believe the total death toll may be several hundred patients. That would make Cullen, the subject of the recently released book The Good Nurse, the most prolific serial killer in American history.
An article in Wired by that book’s author points out, however, that Cullen’s crimes continued for so long, in part, because he proved adept at manipulating flaws in medical device design to obtain the drugs he used to kill his victims. In particular, Cullen is alleged to have exploited an application design flaw in a then-new device, Pyxis Medstation, a medication distribution and management product made by the company Cardinal Health.
As author Charles Graeber notes in his Wired article, Cullen’s technical background made it easy for him to become an expert on Pyxis, which distributes drugs to nurses and tracks withdrawals, linking each with the account of a particular patient and nurse to create a record.
Homicide detectives studying Cullen’s Pyxis records didn’t see a smoking gun — a clear pattern of drug orders by him corresponding to the hospital overdoses. What they did find, however, were lots of canceled orders.
“Cullen had realized that if he placed an order of the drug for his own patient, then quickly canceled it, the drug drawer popped open anyway. He could simply take what he wanted without recording it in the system. It was that easy,” Graeber wrote.
In short: Cullen had discovered what application security folks call a “race condition” in the Pyxis – a flaw in the underlying application logic that opened a small, but exploitable, gap of time between two inputs that left the device open to tampering. In this case, the inputs were placing the order for a drug (thereby opening the Pyxis drug door) and cancelling that order (keeping the door locked).
Cullen figured out that he could pop the Pyxis drug door by issuing, then quickly cancelling, an order for a drug. The order was listed only as a cancellation, but he got access to the pharmaceuticals he needed to murder a patient under his care.
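The flaw as described can be modeled in a few lines. This is a hypothetical sketch of the logic, not Pyxis’s actual code: placing an order unlocks the drawer immediately, while cancelling only rewrites the audit record and never re-locks the drawer, leaving the physical state and the logged state out of sync:

```python
# Abstract model of the order/cancel flaw (invented names, not Pyxis code).
class Dispenser:
    def __init__(self):
        self.drawer_open = False
        self.log = []

    def order(self, nurse, patient, drug):
        self.drawer_open = True          # flaw: drawer unlocks immediately...
        self.log.append(("order", nurse, patient, drug))

    def cancel(self, nurse, patient, drug):
        # ...but cancelling only rewrites the record; it never re-locks
        self.log.remove(("order", nurse, patient, drug))
        self.log.append(("cancel", nurse, patient, drug))

d = Dispenser()
d.order("nurse-x", "patient-7", "digoxin")
d.cancel("nurse-x", "patient-7", "digoxin")

print(d.drawer_open)  # True: the drugs are physically accessible
print(d.log)          # only a harmless-looking cancellation remains on record
```

The fix is to make the physical action and the audit record a single atomic transaction: the drawer should only unlock once the order is committed, and a cancellation should re-lock it (or be refused once the drawer has opened).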
That’s a real head-slapper, but it’s a nice illustration of the high stakes when it comes to medical device security. Of course, serial killers like Cullen are one in a million. But substance abuse by nurses or doctors is a much more common problem, and the Pyxis race condition would work just as well for them.
Even without Cullen as a poster child, the security of medical device hardware and software is going to get a lot more attention in the coming months. Just this week, for example, The Department of Homeland Security warned that medical devices pose a significant risk to the security of healthcare organizations and the sanctity of patient data.
In a May 4th bulletin, DHS warned that rapid adoption of features like wireless network connectivity and remote management greatly increases the “attack surface” of hospitals and other healthcare organizations, while existing regulations do a poor job of addressing – or even assessing – the security of medical devices.
Among the problems cited in the DHS bulletin:
- The U.S. Food and Drug Administration – which is authorized to approve medical devices for use – focuses on device safety, but not security. Issues around configuration and device security have traditionally been out of scope for the FDA.
- The rapid adoption of wireless networking and its use in connecting remote medical devices to health IT networks has opened doors to attacks on medical devices, and to attacks that use vulnerable medical devices as stepping stones to other network resources.
- Similarly rapid adoption of mobile devices like smartphones and tablets within healthcare settings introduces the possibility of patient data loss through insecure network connections and sync operations. Further, unmanaged mobile devices connected to health IT networks are a possible source of attack and compromise.
DHS’s prescriptions for fixing the medical device security issue were what you’d expect: the presence of actual security features on products (sad you have to ask for it, but….), as well as secure deployment: layered protections, strong passwords, user least privilege, patching.
The Cullen case suggests that, even with those protections, danger lurks. DHS would do well to make strong recommendations for application audits and other assessments of underlying code security part of the requirements that medical device makers must satisfy for customers – or regulators.
UBM Tech Director of Content, Jonas Tichenor, interviews Evan Fromberg, Senior Director of Channel Sales and Business Development at Veracode. A transcript of the interview is available below the embedded video.
Jonas Tichenor: Hi, I’m Jonas Tichenor, Director of Content for UBM Tech Channel, and joining us today is Evan Fromberg, Senior Director of Channel Sales and Business Development at Veracode. So tell me, specifically: what is Veracode?
Evan Fromberg: Veracode is an on-demand platform that helps enterprise customers understand risks in software applications they’ve developed or bought from third parties. As threats continue to emerge and businesses become more reliant on and interconnected through software, there are potential risks in securing that software.
Jonas Tichenor: Describe to me some of the key problems that Veracode is helping to solve today?
Evan Fromberg: Today, mostly enterprise customers have reached out to us saying they have a problem: they’re increasingly dependent on software they develop or purchase from third parties. They need a way to understand how they can continue to develop and push new solutions to market without slowing down development, while ensuring that they’re secure at the same time.
Jonas Tichenor: Does that include everything from cloud-based solutions to mobile applications, the whole nine yards?
Evan Fromberg: Yeah as you can imagine with mobile applications exploding in use and with web applications really creating an interdependence and an increasing need to be connected all the time, businesses are more and more reliant on software in both mobile and web applications. So for example with BYOD (bring your own device) and mobile applications that’s really a key area that we’re helping enterprises understand risk.
Jonas Tichenor: So exactly who are the adopters of some of this technology and why do you think that is?
Evan Fromberg: Traditionally in the solution provider space, application security as a market has been challenging. There’s a finite group of resources that have the skill set to really test and understand if there are security flaws in applications. I think it’s a tremendous opportunity. What we’re seeing is that many solution providers want to help their customers find innovative and unique solutions, but they haven’t had the skill set to do so. So our value proposition, and what we’re seeing in the market, is to leverage automated on-demand platforms such as Veracode to help solve unique problems for your customers without having to find the scarce resources that have traditionally been hard for solution providers to acquire.
Jonas Tichenor: So what role do you feel like mobility plays in this application security space?
Evan Fromberg: We’re an on-demand SaaS platform, with all the benefits people want from software-as-a-service or cloud: resources on demand, pay as you go, use them when you need them. Essentially, we help organizations understand where the application software they’ve either developed or bought from third parties is most likely to be breached through flaws in those applications. Hackers understand exactly how to leverage coding or development flaws to gain entrance, and we give enterprises that viewpoint as to where they’re most likely to be breached.
Jonas Tichenor: So you show them where. Do you also help them figure out how to put the stop in?
Evan Fromberg: That’s a great question. We give them visibility into where they’re most likely to be breached and where potential vulnerabilities exist, we give them pinpoint accuracy as to how they should fix that and what these coding flaws mean, and we help them understand exactly, from an application perspective, where they should fix it. Then, together with our solution provider community, there’s a great opportunity to ask: where should I start first, how do I build a remediation strategy, and what’s the best use of my time to deal with the most pressing vulnerabilities first and deal with the others later?
Jonas Tichenor: So that fits nicely into this next question, where do you see the opportunities in the channel around the specific security space?
Evan Fromberg: I think there are a couple. When we look at the channel today, there are solution providers that are delivering solutions and technologies to help protect sensitive data, and there’s also a subset of solution providers or ISVs who are developing software as part of how they go to market. I think there’s a great way to extend the traditional security practice of protecting sensitive data, letting the channel enter a market that was traditionally hard for them to reach. I think there’s also an opportunity for those people that are building custom software to differentiate by showing that they’ve gone through the extra steps to make sure that software has been independently verified by a third party such as Veracode.
Jonas Tichenor: Give me a specific example of how Veracode has helped a solution provider or independent software vendor.
Evan Fromberg: Today we have a global partner program with over a hundred partners that are in application security and are helping secure applications, but have a hard time scaling their business. They turn to us to increase scalability, decrease the time it takes them to complete projects, and really capture revenue and solve problems for customers in a segment of security that was challenging for them. Specific to third-party or independent software vendors, we help them provide independent verification showing they’ve gone through the due diligence to know that the software they’ve produced is free of security flaws, which makes it more likely to be adopted by end users in the marketplace.
Jonas Tichenor: Hackers are smart. Once they find the hole they know where to go the next time, so it’s interesting that you guys help pinpoint where to put that stop in.
Evan Fromberg: While it sounds easy, it’s really one of the key areas today leading to breaches and sensitive data loss. Trustwave recently did a survey and found that seventy-six percent of breaches came from some component of third-party software. We think that’s because of the complexity; as challenging and disparate as software development is, this was a hard problem to solve, and we think we’re helping in our own way to make it a lot easier.
Jonas Tichenor: Do you have any advice for solution providers out there in the community that might not be in the application security space but certainly want to take advantage of the opportunity?
Evan Fromberg: I think it’s a logical step in how organizations, specifically enterprise customers, will look to protect sensitive data moving forward. With the use of mobile and web applications we’ve talked about, it’s absolutely an area they need to consider, and it’s a skill set that is challenging to find. An automated platform and a SaaS model like Veracode’s gives them a place to partner, not only to grow their business but to meet the needs of customers who are looking for innovative solutions.
Jonas Tichenor: Evan, any final thoughts before we wrap up for today?
Evan Fromberg: We’ve had our partner program for over two and a half years, we’re global in nature, and it’s very easy to get started with us due to our on-demand SaaS model. There’s no installing software or hardware, and it’s very easy to leverage the capabilities we’ve brought to market.
Jonas Tichenor: Evan Fromberg, Senior Director of Channel Sales and Business Development at Veracode, thanks so much for being with us today.
Evan Fromberg: My pleasure, thank you.
Veracode has been beating the drum about the inherent danger of “third party” code in application development. Whether that code is “shrink wrapped” and supplied by a third party firm or open source, our research has shown that it often comes chock-full of security holes – some of them exploitable.
Now a report by the firm Sonatype reinforces that message. Sonatype’s survey of 3,500 developers (PDF format report here) found that use of open source software is exploding in the application development community. Alas, much of it is unchecked, with few if any controls over what components are being used, or how.
The Sonatype Open Source Software Development Survey, released Tuesday, studied the way that organizations adopt, use and support open source software. The survey found that open source use is skyrocketing, with applications now more than 80% “component-based.” But 76% of organizations surveyed admitted they have no formal policy in place to manage or track the use of those components.
“The lack of internal controls and a failure to address security vulnerabilities throughout the software development lifecycle threatens the integrity of the software supply chain and exposes organizations to massive, unmanaged risk,” Sonatype warned.
According to Sonatype, which operates a Central Repository from which open-source components can be downloaded, use of those components exploded in 2012. Sonatype’s Central Repository registered eight billion component downloads, an 800 percent increase in activity since its inception.
Furthermore, nearly 80 percent of the organizations surveyed by Sonatype reported components found in Sonatype’s Central Repository were “important or critical to their development efforts,” with 86 percent claiming that their applications were “80 percent open source with the remaining 20 percent custom components and code.”
But much of that adoption is willy-nilly. Of the large organizations surveyed by Sonatype (defined as companies with more than 500 developers), 76 percent said they have “no control over what components are being used in software development projects.” Fully 65 percent of those companies said they don’t maintain an inventory of components used in production applications. Sonatype said 57 percent of those surveyed “lack any policy governing component usage,” while those that do have policies in place admitted, “enforcement is a challenge and not a top priority.”
Why? Big surprise: developers cited the tendency of such checks to slow development as the major reason they were not adopted, as well as unclear or inconsistent enforcement of policies around component use.
Sonatype found that more than half of survey respondents from large enterprises reported that developers “don’t focus on security at all,” with one in five of those saying they “don’t have the time to spend on it.” Just one in four of the survey respondents said they work at an organization that requires them to prove the components they use do not have known vulnerabilities.
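The gap Sonatype describes – teams that cannot prove their components are free of known vulnerabilities – can be narrowed with even a rudimentary audit step. The sketch below is purely illustrative: the inventory, the vulnerability table, and the `EXAMPLE-2013-0001` advisory ID are all made up for the example, and a real check would consult a feed such as the NVD or a scanning tool rather than a hard-coded table.

```python
# Hypothetical sketch of a component audit. All names and data below
# are illustrative, not drawn from any real vulnerability database.

# A minimal component inventory: (name, version) pairs, as might be
# extracted from a build manifest.
inventory = [
    ("commons-collections", "3.2.1"),
    ("log-widget", "1.0.4"),
]

# A toy known-vulnerabilities table mapping component name to a dict
# of affected versions and their advisory IDs.
known_vulns = {
    "commons-collections": {"3.2.1": ["EXAMPLE-2013-0001"]},
}

def audit(inventory, known_vulns):
    """Return the subset of the inventory that has known vulnerabilities."""
    findings = []
    for name, version in inventory:
        advisories = known_vulns.get(name, {}).get(version, [])
        if advisories:
            findings.append((name, version, advisories))
    return findings

for name, version, advisories in audit(inventory, known_vulns):
    print(f"{name} {version}: {', '.join(advisories)}")
# prints: commons-collections 3.2.1: EXAMPLE-2013-0001
```

Even a check this simple answers the two questions most of the surveyed organizations could not: which components are in use, and which of them are known to be vulnerable.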
To be sure, Sonatype has a dog in this fight. At the same time they announced the findings of their survey, the company announced “Sonatype CLM” – a Component Lifecycle Management solution that can “secure the entire component lifecycle.” Coincidence? I think not.
That doesn’t mean, however, that we should ignore the findings of the company’s survey. In fact, the results jibe with what Veracode has reported in its own State of Software Security Report in recent years.
In the most recent edition of that report, Veracode disclosed that the average enterprise has 600 mission-critical applications. Around 65% of those are developed externally, leaving companies increasingly vulnerable to the security risks found in these apps. In fact, Veracode found that between 30 and 70% of applications that organizations think of as internally developed are actually composed mostly of third-party libraries and components.
“The widespread adoption of third-party apps and use of external developers in enterprises brings increased risk,” Veracode’s Vice President of Research, Chris Eng, said at the time.
It’s long been recognized that organizations take on substantial, but hard-to-quantify risk by trusting their third-party software suppliers to develop applications that meet industry and organizational standards. Organizations embracing agile development methodologies by leaning heavily on open source components are similarly exposing themselves and their customers in ways that they may well not comprehend.
But the security of third party components may start to get a lot more attention. The recently released OWASP Top 10 Application Security Vulnerabilities specifically calls out using components with known vulnerabilities as one of the top 10 issues (A9). Specifically calling out vulnerable, packaged components may get organizations to actually recognize a risk they’ve been happy to ignore. However, it’s likely that larger changes – cultural changes – will be needed to bend the priorities of software publishers away from speed and time to market, and toward security and the quality of finished goods.