Last night our CTO and Co-Founder Chris Wysopal joined Fox Business’ The Willis Report to chat about medical record privacy in a segment titled “Digital Records Putting Your Health Information at Risk?”
In the six-minute segment Chris talks about “the dark side” of putting medical data online in cloud servers. Among the stats thrown around:
- 50% of doctors’ offices put customer data online,
- 80% of hospitals put customer data online,
- 21 million people had electronic records stolen in the last three years,
- 94% of healthcare companies report data breaches.
Staggering numbers, no doubt. But exactly how dangerous is this information? Exposed medical records enable health insurance fraud, financial identity theft, credit damage and even personal endangerment. If someone undergoes a medical procedure under your identity, your medical records become flawed: in an emergency, your records could say you’ve had your appendix out when in fact you haven’t.
Beyond personal data privacy concerns are medical device security concerns, a topic we’ve previously touched upon. On the subject, Wysopal says, “The medical device problem is particularly scary because you have these devices which were standalone and now you’re adding wireless functionality to them…so you can monitor these devices and connect to them. A lot of them weren’t designed with security in mind.” All of a sudden, devices that were designed to be accessed only physically, in person, are being exposed to attackers online. Wysopal adds, “It’s also hard to fix these medical devices and update them because there’s such a long certification process…they aren’t like typical IT systems that you can patch in a few hours.”
So what can you do to protect yourself?
- Ask your health insurance company for a copy of your medical record and activities.
- Pull your credit report at least once a year and verify all accounts and activity.
If you don’t recognize something on one of these two reports, raise a red flag immediately, starting with your healthcare provider. Check out the full video here for more great information.
Nothing’s free in this world, especially not when it comes to security. With Twitter officially cramping your style, you are now forced to waste precious seconds you could be tweeting by waiting for a verification code to be delivered to your phone just so you can log in.
The thing about options is that you have them…and options tend to let people remain lazy. Options also carry consequences, which never make sense until they actually happen to you. That being said, Twitter gives you the option to activate two-factor authentication, but first…you are going to have to link a phone number to your account.
As the plot thickens, it also doesn’t yet scale for those with the biggest targets on their backs. Media outlets cannot afford to sacrifice the coverage they get with multiple users on staff for a little bit of security…but this is only the first round from Twitter, as they have informed us all to “Stay tuned”. So maybe it is less likely we will be seeing tweets announcing Justin Bieber’s birth to Siamese monkey twins at the Anne Frank House in the coming weeks, but knowing your Twitter account is (more) secure is worth it, right?
I know we all love the instant gratification that comes from the massive amount of irrelevant nonsense Twitter delivers around the world; the very concept of a tweet is that thoughts and opinions (assuming they are <140 characters) are available to all of your loyal followers just as quickly as you can get them out.
Keep fighting the good fight my friends. Until next time, “help us, help you”.
The Internet of Things is upon us. We’re at the dawn of a new era that will bring changes even more transformational than those of the past two decades. There’s just one big obstacle in the way: IT and application security.
That’s the conclusion of a couple recent studies that consider adoption of key features of the “thing” based Internet, such as machine-to-machine communications.
In the first, the analyst firm ABI Research predicted that the market for machine-to-machine (M2M) communications has the potential to reach $198 billion by 2018. The market will be driven by the adoption of a wide range of “smart” technology – from household appliances to medical equipment to mobile devices.
However, the development of the M2M market may be constrained by what ABI calls a “consistent lack of interoperability” between devices with M2M interfaces, and a lack of application level protections. “M2M devices themselves are generally left unsecured and, as they increasingly connect to enterprise backbones, such exposure poses a risk, providing a vulnerable back door into the network,” wrote ABI researcher Michela Menting in a report.
Another red flag came in the form of a survey of 1,300 German businesses and universities that are members of the German Association for Electrical, Electronic and Information Technologies (VDE). That survey asked about adoption of smart manufacturing technologies – or “Industry 4.0,” as it’s known. Industry 4.0 is a catch-all term (more common in Germany) that subsumes a number of trends: ubiquitous sensors and computing, IT-enabled machinery, process innovation, the Internet of Things and so on. The benefits of adopting this technology are self-evident: more efficient factory floors, real-time troubleshooting, more efficient supply chain operations and rich data for executives, factory managers and customers about the entire manufacturing process.
Still, the VDE survey found that 70% of those surveyed doubted that smart manufacturing goals would be achieved by 2025. Why? IT security was the most oft-cited obstacle, with 66% of those surveyed saying a lack of proper security controls was reason to hold back on investments in smart manufacturing technology.
And the cautious Germans aren’t the only ones who would like to put the brakes on the breakneck pace of adoption for intelligent devices. In just the last few weeks, the U.S. National Highway Traffic Safety Administration (NHTSA) expressed that it was concerned that the increasing complexity of electronic systems and sensors in automobiles warranted more scrutiny by the government in order to ensure passenger and driver safety.
“With electronic systems assuming safety critical roles in nearly all vehicle controls, we are facing the need to develop general requirements for electronic control systems to ensure their reliability and security,” David Strickland, Administrator of the NHTSA, told a Senate Committee. Modern automobiles are “becoming increasingly interconnected,” he said, leading to “different safety and cyber security risks.”
The story is the same in other industries as well – like healthcare – where smart device adoption is raging ahead, but sensitive information is at even greater risk.
So what’s to be done? Let me digress by noting that I attended a design-focused symposium this week called “Bytes and Atoms” that was co-sponsored by O’Reilly Media, the digital media firm Brightcove and others. The symposium addressed the challenges of the growing interconnections between our digital and physical selves. The speakers – many of them professional design consultants, architects and technologists – talked up the potential of The Internet of Things, likening it to the dawn of the World Wide Web in the early 1990s.
The talks were inspiring – we saw prototypes of ‘intelligent’ workout shirts by Under Armour that will monitor your body’s performance as you’re working out, non-invasive “smart” medical devices and telepresence robots that can help provide care to patients in remote locations.
But it struck me at the event that, as with the early days of the Internet and the Web, security was being shunted off to the corner of the stage. Now, as then, the decisions we make will sow seeds that we only harvest in five, ten or twenty years. We can see now how a lack of security in core Internet protocols like TCP and SMTP has given birth to a wide range of online ills, from spam to denial of service attacks, yet we risk making the same mistake by focusing on what ubiquitous sensors, cloud-based infrastructure and powerful mobile devices can do, rather than how they do it. I think there’s a clear need for greater scrutiny of the security of the hardware, software and communications protocols that will undergird the Internet of Things – lest we fail to learn from the mistakes of the past and doom ourselves to repeat them.
We have a rich history of running webinars at Veracode, but it seems that recently we’ve been doing more in-house and via partner channels. Going forward you’ll see a monthly update like this post detailing all our anticipated online events, to make you aware of our webinars sooner and help you plan for attendance if you’re interested. Without further ado, here’s the slate of webinars for the remainder of May!
Breached! App Attacks, Application Protection and Incident Response
Brief: This webinar will first explain software application vulnerabilities and define their various types. It will also present recent research findings about the prevalence of these vulnerabilities and their impact. From there it will discuss what organizations can do to harden their applications. Finally, the webinar will cover best practices for responding to a successful application attack.
- Date: Thursday, May 23
- Time: 1:00pm – 2:00pm EDT
- Host: Co3Systems
- Veracode speaker: Chris Wysopal
- To register for the event visit: https://www4.gotomeeting.com/register/901209343
The Intractable Problem of Software Security
Brief: We all know that applications are inherently insecure, yet some of the highest profile breaches in 2012 were the result of easily remediated coding flaws. These flaws persist in almost all the software that runs most websites and businesses; SQL injection alone affects 32% of web applications. If the current state of software security is any indication, we’ll continue to hear about major data breaches in 2013 and beyond.
Join Chris Wysopal, Veracode’s Co-Founder and CTO, as he discusses the current and future state of appsec. He will dive into the data that drive the predictions detailed in the Veracode’s fifth annual State of Software Security Report. This report pulls data from tens of thousands of live application scans performed on the Veracode Platform.
- Date: Friday, May 24
- Time: 1:00pm – 2:00pm EDT
- Host: SANS Institute
- Veracode speaker: Chris Wysopal
- To register for the event: https://www.sans.org/webcasts/intractable-problem-software-security-96655
Don’t Ask, Don’t Tell: The (In)Security of Vendor Software
Brief: What vulnerabilities threaten the integrity of your software supply chain and data? Can your enterprise really influence software vendors to meet your most important security policies and remediate insecure software?
Action is needed, and urgently. An alarming 62 percent of all applications fail to reach compliance on their first submission, according to a study recently conducted by Veracode, Enterprise Testing of the Software Supply Chain. While few enterprises now have formal third-party testing programs, those that do find they dramatically improve vendor compliance while meeting industry standards.
- Date: Thursday, May 30
- Time: 9:00am – 10:00am EDT
- Host: BrightTALK
- Veracode Speaker: Chris Eng
- To register for the event visit: https://www.brighttalk.com/webcast/574/74823
I recently came across an interesting blog post by a team member at Acunetix that addressed a challenge many enterprises are facing when it comes to securing third-party components. This is a pretty hot topic in certain circles these days, and understandably so – studies have suggested that as many as 65% of an enterprise’s mission critical applications are developed externally. Additionally, Veracode research shows that a typical internally developed application contains somewhere between 30% and 70% externally developed code, indicating that even internally developed apps utilize code originating outside their own walls.
Given these statistics, Mr. Beaver provides some great advice – involve the team members truly at risk in the risk-vs.-reward decision rather than leave the decision solely up to IT. However, the challenge of vendor risk management has grown significantly. In the past we’ve seen pre- and post-procurement assessments covering a variety of topics, including the financial security of software vendors, background checks of employees, physical checks of vendor environments and scanning of perimeter components such as firewalls. Surprisingly, it is only in the last several years that we have seen a rise in the number of enterprises making scanning of third-party software a part of the procurement process. At Veracode this effort is clear, as we’re on pace to analyze, educate and help improve the security posture of software at over 1,000 vendors in 2013.
As application security scanning of third-party applications becomes a standard part of the procurement process, we will see the focus move toward the root cause of issues: identifying code-level flaws in applications and driving vendors to fix them. In fact, we currently share our best practices and lessons learned with the community to help improve their vendor relationships and simplify the scanning process.
While our main focus is on helping large enterprises drive security improvements in their vendor community, we understand that a comprehensive Vendor Application Security Testing (VAST) program may not work for all vendors, or for organizations that don’t already embrace application security best practices in the supply chain.
For those teams looking to begin a vendor risk management program, I recommend the following:
- Clarify the Goal – A simple “Scan and we’ll tell you what to fix” request can be frustrating for vendors who don’t have a clear goal. Define a policy (such as removal of all flaws in the OWASP Top 10), testing techniques (dynamic, static, manual pen testing, etc.) and reasonable timelines (think in terms of months, not days) in which vendors should fix their flaws.
- Understand Market Immaturity, But Drive Maturation – Some of the more mature software vendors have very impressive AppSec practices including developer education, static analysis integrated into the SDLC, routine dynamic and penetration testing and a variety of other activities and are typically very cooperative in fulfilling security requests. While these vendors are great to work with, they’re not the norm. Don’t punish those vendors who have not committed to building security in, but make it clear that they will be expected to do so in the very near future or face potential impacts on your business relationship.
- Disparate Responses are OK (for now) – It’s often the case that security is an afterthought for vendors – they’re typically worried more about the next release, new features and keeping the lights on than providing a secure product. The goal of vendor activities should be twofold:
- Gather information your organization needs to make better informed decisions.
- Drive better practices for vendors going forward.
- Tip: Providing vendors with options such as an initial questionnaire like the vBSIMM will get your team short-term, self-attested answers and drive vendors to adopt better long-term practices. Being realistic but prescriptive in your requests will help not only get initial responses but also drive longer-term adoption of best practices.
- Work With Your Peers – Vendors are typically selling to dozens, if not hundreds, of customers. If they have to fulfill a separate security requirement for each of those they’ll quickly become frustrated. Most industries have groups that are beginning to focus on AppSec best practices in terms of vendors (such as FS-ISAC in the financial services community). Investigate these groups and begin discussing how you can work together to drive standard goals for vendors.
- Involve Procurement – Nearly every vendor deliverable must go through a procurement process that includes a variety of requirements. Work with this team to include AppSec requirements in the RFP process and include contract language that requires participation in security analysis and remediation. We make our recommended language available for free at: http://www.veracode.com/services/build-security-criteria-into-contracts.html
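As a concrete illustration of the “Clarify the Goal” recommendation above, a vendor policy can be made machine-checkable. The Python sketch below is purely hypothetical (the category names, finding fields and the 90-day window are all invented for illustration); it flags findings that are both in a disallowed category and past their remediation deadline:

```python
from datetime import date, timedelta

# Hypothetical policy: disallowed flaw categories must be remediated
# within a window measured in months, not days.
POLICY = {
    "disallowed_categories": {"sql_injection", "xss", "broken_auth"},
    "remediation_window": timedelta(days=90),
}

def violations(findings, policy, today):
    """Return findings that are policy-relevant AND past their deadline."""
    out = []
    for f in findings:
        overdue = today - f["reported"] > policy["remediation_window"]
        if f["category"] in policy["disallowed_categories"] and overdue:
            out.append(f)
    return out

# Example vendor scan results (invented data).
findings = [
    {"id": 1, "category": "sql_injection", "reported": date(2013, 1, 10)},
    {"id": 2, "category": "xss",           "reported": date(2013, 5, 1)},
    {"id": 3, "category": "weak_cipher",   "reported": date(2013, 1, 1)},
]

# Only finding 1 violates: it is in a disallowed category and overdue.
print(violations(findings, POLICY, today=date(2013, 5, 20)))
```

A rule set this explicit gives vendors the clear goal the recommendation calls for: which flaws matter, and how long they have to fix them.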
Securing third-party applications is becoming an increasingly popular topic in the security community. Regardless of the type of solution, enterprises are realizing they don’t control or have any idea what the security posture is for many of the products they use to run their businesses. It could take several years to drive improved awareness and adoption of secure development practices across the broader vendor community, but if businesses begin by following the above recommendations, they are taking a huge first step in making sure the applications they use aren’t putting their business at risk.
Tomorrow Veracode co-founder and CTO/CISO Chris Wysopal and Josh Corman, co-founder of Rugged Software and Director of Security Intelligence at Akamai Technologies, will be filming a video segment with Paul Roberts of The Security Ledger.
The trio will be chatting about a variety of topics trending in the AppSec field, including but not limited to recent changes to the OWASP Top 10, the security of third-party software components, and industry culture.
The two will also be answering a selection of questions submitted by members of our community. If there’s a question you’d like answered by either Chris or Josh, you can submit it in the comments on this post or tweet it to @Veracode using the hashtag #TalkingCode! Please submit your questions for consideration before 12pm EST on Friday, May 17th.
The video will make its debut on The Security Ledger in June; as always, we’ll let you know once it’s live.
A large-scale survey of IT security professionals found that application security is the most pressing security problem facing them, beating out malicious software and mobile devices, according to a survey released by (ISC)2 and Frost & Sullivan.
The 2013 (ISC)2 Global Information Security Workforce Study ranked application security issues at the top of its list of security concerns – the same place they occupied in a similar survey in 2011. Application vulnerabilities were listed as a “top” or “high” concern by 69 percent of survey respondents. That’s a slight dip from 2011, when 73 percent of respondents named them their top security threat. Malware, including viruses and worms, moved up to the #2 spot, with 67 percent of respondents listing it as a “high” concern or their “top” concern.
Now, if you follow security for any amount of time, you know that there are all kinds of surveys. Vendors survey their customers. Publications survey their readers. Random web sites survey random collections of folks who visit their site. There’s a lot of variability and survey results should always be taken with a grain of salt. That said, the (ISC)2 survey has some weight to it. First of all, it was conducted with the help of professionals (Frost & Sullivan as well as Booz Allen Hamilton), not the Director of Marketing of SecurityStuff Inc. Second, the sample population is large: 12,000 information security professionals.
The survey data concerning application security is revealing. Those surveyed didn’t just say they worried about the threat posed by application vulnerabilities, they also acknowledged that much of the blame lay within their organizations. Forty-one percent of those surveyed listed “applications and system development security” as their second most urgent training need, after “information risk management” (the choice of 47% of those surveyed). As the study puts it: “Many organizations have come to the realization that their own internally created software suffers from the same security risks as those coming from a vendor.”
When asked what aspects of software development held the most security concerns for them, respondents said it was early-stage development that concerned them the most – not QA. Eighty-one percent listed software “design” as the development task in need of better security, followed by “Specifying requirements” and “Testing, debugging or validation.”
That probably shouldn’t come as a surprise. IT security professionals are increasingly involved in software development. Fully 22% of those who took the (ISC)2 survey said that they were “personally involved in software development.” Of those respondents in the Americas region, the figure was 24%.
As this blog (and others) have noted on many occasions, more secure application development starts with better training for would-be application developers. That’s especially true as more and more applications make use of shared and open source components that speed development, but often at the price of security.
Chris Wysopal on Tuesday wrote about SAFECode, a program spearheaded by Adobe to offer free application security training to developers. With almost three quarters of applications submitted to Veracode failing to comply with enterprise security standards, any help is appreciated. The (ISC)2 survey is a reminder that, as the ranks of those involved in software development swell, so does the need for education about application security.
Our entire Research team is in town this week for a round table catch up, and this fun artist’s rendition of them materialized. Given that I haven’t personally met them all, I was unable to identify a few of them by these cartoons. I figured I’d turn to our trusty community to help me out: comment below with the number next to an avatar and the full name of its human counterpart if you think you know who it is.
Who is this tablet toting technomancer?
The first to correctly identify any Veracoder will be credited below by name or Twitter handle!
- Isaac Dawson (@_wirepair) by @c0ntr0llerface
- Chris Eng (@chriseng) by @hackerhuntress
- Brandon Creighton (@unsynchronized) by X30n
- Fred Owsley (@fredowsley) by @gwenkrauss
- Ryan O’Boyle (@523) by @c0ntr0llerface
- Bonus: Melissa Elliott (@0xabad1dea) by @hackerhuntress
A developer’s main goal usually doesn’t include creating flawless, intrusion-proof applications. In fact the goal is usually to create a working program as quickly as possible. Programmers aren’t security experts, and perhaps they shouldn’t be. But when 70% of applications fail to comply with enterprise security standards (data from Veracode SoSS vol. 5), it is clear more attention needs to be given to secure programming techniques.
This is why when I came across an article describing a new training program by the Software Assurance Forum for Excellence in Code (SAFECode), I was pleasantly surprised. The organization, led by Howard Schmidt, will offer training courses for “anyone that does development work”. The first six training courses will focus on web application security flaws such as SQL injections and Cross Site-Scripting.
I haven’t had a chance to view the full curriculum, but I’m confident the security pros at Adobe have put together an excellent program. Web application security flaws are some of the easiest to avoid and among the most exploited, yet they are also some of the most common, so I think starting the program with lessons on web applications is a great first step. It is an extra bonus that the material will be Creative Commons licensed, which should allow for wide distribution. The free on-demand training courses are available at:
The security industry needs more programs like the training from SAFECode. When combined with integrating security testing and scanning into the software development lifecycle (SDLC), these programs will help create less vulnerable applications and reduce the number of successful attacks using well known vulnerabilities. While it seems like most people agree on these points, the need for speed has somehow made slowing down to consider security during the development process uncool. This is especially true when programmers don’t have as many resources at their disposal, for example, when developing open source applications. It is as if acknowledging that you may have security flaws in your code is the same thing as admitting you aren’t a true programmer. This couldn’t be farther from the truth. Even the smartest, most innovative programmers can create software with flaws because they are human and imperfect, just like the rest of us.
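To make the point concrete, consider SQL injection, one of the well-known vulnerabilities mentioned above: the fix is typically a one-line change to a parameterized query. Here is a minimal Python sketch (sqlite3 and the table contents are just for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user_vulnerable(name):
    # String concatenation lets attacker-controlled input rewrite the query.
    return conn.execute(
        "SELECT role FROM users WHERE name = '" + name + "'").fetchall()

def find_user_safe(name):
    # Parameterized query: the driver treats the input strictly as data.
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)).fetchall()

payload = "' OR '1'='1"
print(find_user_vulnerable(payload))  # returns the admin row despite the bogus name
print(find_user_safe(payload))        # returns nothing
```

The difference between the two functions is a single line, which is exactly why breaches caused by this flaw class are so frustrating.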
Offering free training courses and materials on secure coding will hopefully serve a dual purpose. My first hope is that it will help programmers use more secure coding practices. The second is that it will eliminate the taboo of admitting (during the development stage) that an application could have security vulnerabilities. Only then can flaws be remediated before the program is released.
Everyone has had that dreaded experience: you open up the task manager on your computer… and there’s a program name you don’t recognize. It gets worse when you google the name and can’t find a concrete answer on what it is and why it’s there. It gets even worse when you remove it from Autoruns and it comes back. It gets terrible when you realize it has keylogger functionality. The icing on the cake, however, is when the mystery program is also eating up all your RAM.
The RAM issue is actually how this special little program on my own computer came to my attention. I recently bought a high-end Windows 8 tablet – to protect the guilty, we’ll call the manufacturer “Spacer”. Like most Windows computers, it came with an assortment of apps preinstalled by “Spacer”, ranging from the mildly useful to trash you delete without hesitation. In particular, I liked the interface that popped up when I plugged into HDMI, so I didn’t go on a vendor utility murdering spree.
I happened to have Resource Monitor open, and I noticed that the second-most RAM-hungry program was… a “Spacer” background service with a generic name, consuming 280MB. Not bad for a 15KB binary! Googling the name, “MEMS Enhancement Utility”, only turned up other customers wondering what it was and observing that getting rid of it didn’t seem to break anything. I disabled it and rebooted, but it came back. Presumably, one of the “Spacer” apps was serving as a watchdog for the others. The easy solution is to simply get rid of the program altogether, but I decided to investigate what made this program so important in the first place.
Figure 1: Not the most clarifying metadata
It turns out that the program was written in .NET, which is vastly easier and faster to reverse-engineer than conventional native binaries. At Veracode, we have our own internal tools for automated analysis of .NET programs, but for interactive purposes, I recommend the free JetBrains dotPeek.
When starting an investigation of a binary, I like to take a quick tour of bundled functionality.
Figure 2: Imported Namespaces
Aside from the typical imports, Windows7.Sensors is a fairly self-explanatory name, and is in fact just a sample code kit off MSDN for reading the tablet’s accelerometer. That’s interesting but rather benign functionality. Far more… concerning are the member variables and methods of the “gma” namespace.
Figure 3: Consider my eyebrows raised
This is, of course, the classic sign of a userspace keylogger, but for every keylogger out there, there are a hundred legitimate apps that hook the keyboard and mouse for perfectly sensible reasons; otherwise, why would it even be in the standard Windows API? I was, however, beginning to question the provenance of this application.
The actual logic of the utility, however, was… puzzlingly brief. It initiated a nearly-empty form and hid it. It set up handlers to receive keyboard, mouse, and accelerometer activity. It then set up timers to poll the accelerometer based on how long it’d been since the last keyboard or mouse activity. And that’s it. That’s all. The application does not store or transmit or even display the information polled. It does nothing. I spent the better part of two hours scouring the obscure corners of the binary, thinking surely I must be missing some cleverly hidden method that actually uses this data. I couldn’t find one.
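The decompiled control flow amounts to something like the following sketch (Python here for readability; the class and method names are mine, not the binary’s, which is .NET):

```python
import time

class IdlePollingService:
    """Sketch of the decompiled logic: input hooks record activity,
    a timer polls the accelerometer once the user has gone idle,
    and the reading is then simply thrown away."""

    def __init__(self, read_accelerometer):
        self.read_accelerometer = read_accelerometer
        self.last_input = time.monotonic()
        self.polls = 0

    def on_keyboard(self, event):
        self.last_input = time.monotonic()  # keyboard hook callback

    def on_mouse(self, event):
        self.last_input = time.monotonic()  # mouse hook callback

    def timer_tick(self):
        # Poll only after a stretch of inactivity, so a tap on the
        # touchscreen (which fires mouse events) suppresses the read.
        if time.monotonic() - self.last_input > 0.5:
            reading = self.read_accelerometer()
            self.polls += 1
            # The reading is never stored, transmitted, or displayed.
            del reading
```

Which raises exactly the question I had: hooks, timers, sensor reads… and a result that goes nowhere.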
Putting aside this issue for now, I couldn’t help but think: why in the world is this little app ballooning into hundreds of megabytes of RAM? That’s usually the sign of a runaway memory leak, but in a pure .NET application, such things are actually difficult to cause, whereas in a C/C++ program they’re very difficult not to cause.
The answer lies in the fact that the sensor-reading DLL uses marshaling to interact with native APIs, and actually calls traditional memory allocation routines. Hence, every time the accelerometer is polled, manual allocations are made that may or may not ever be manually freed depending on control flow.
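The failure mode can be simulated in a few lines of Python (all names here are invented; the real code is .NET marshaling glue calling native allocators). A counter stands in for native heap allocations the garbage collector cannot see, and the buffer is freed only on the happy path:

```python
# Stand-in for the native heap: the GC never sees these "allocations".
outstanding_allocations = 0

def native_alloc():
    global outstanding_allocations
    outstanding_allocations += 1
    return object()  # stand-in for a raw unmanaged pointer

def native_free(buf):
    global outstanding_allocations
    outstanding_allocations -= 1

def poll_sensor(sensor_ready):
    buf = native_alloc()     # glue code allocates an unmanaged buffer
    if not sensor_ready:
        return None          # early exit: the buffer is never freed
    value = id(buf)          # pretend to read the sensor report
    native_free(buf)         # freed only on the happy path
    return value

# Half the polls hit the leaky branch...
for i in range(1000):
    poll_sensor(sensor_ready=(i % 2 == 0))

print(outstanding_allocations)  # 500 buffers leaked
```

Every poll that takes the unlucky branch leaks a buffer, so memory usage scales with how often the sensor fires, which is exactly the behavior observed next.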
Figure 4: One of the places where the sensor glue code indulges in manual memory management
Violently shaking the tablet (is this thing still under warranty?) causes the RAM usage of “MEMS Enhancement Utility” to spike, but not all at once, as the accelerometer reader is going off at timed intervals rather than constantly. The memory usage will balloon by several megabytes every time I shake the tablet, and after a few minutes, some of it will be reclaimed by the garbage collector but some will not. Hence, the base RAM usage of the process steadily creeps upwards.
So now we know what the program does and why its memory usage is so high, but that still leaves the question of why it’s doing this at all. The clues are there, vestigial remnants of removed code, exciting to any Executable Archaeologist:
Figure 5: Declared variables of the main form
The generically named “Form1” of the application contains several widgets which are never actually displayed: a start button, a stop button, a place for displaying mouse coordinates, and a text box for displaying some other unspecified data. I believe this was originally a debugging utility used by “Spacer” engineers to calibrate the accelerometer so that it would not go off when one simply tapped on the touchscreen (triggering a mouse event or keyboard event). They didn’t bother to rigorously prevent memory leaks because it was never intended to run for more than a few minutes at a time. Somehow, through some miscommunication, a copy of this program with the logic for rendering the visuals stripped ended up on the list of utilities that needed to be kept in the final version of “Spacer’s” Windows 8 image for this model of tablet. Someone then dutifully registered it to be launched in the background every time the tablet boots, and every time the tablet is tilted, shaken, or prodded a little too hard, the RAM usage goes up.
Never attribute to malware what can be adequately explained by a few lines of debugging code somebody forgot to disable.