Wednesday, December 28, 2016

PHPMailer vulnerability

I blogged yesterday about the release of the PHPMailer vulnerability CVE-2016-10033 and how it was unlikely to be exploited in a default release of Joomla.  Now there's a PoC released, but I still haven't changed my position.

I'm sure that there are vulnerable applications out there.  I also always recommend that people patch as soon as possible when patches are available (pending testing).  But this one seems overhyped to me.  Joomla! includes PHPMailer as a library, but doesn't use it in any way that allows for exploitation.  SugarCRM uses PHPMailer too, but it isn't immediately clear to me whether it is used in a way that allows the vulnerability to be triggered.  Again, you should patch, but don't burn down the house to do it unless you know you are vulnerable.

As an aside, the default PoC script (which every skiddie out there will use without modification) uses the string "zXJpHSq4mNy35tHe" as a content boundary.  You can feed this string to your IDS to spot attackers on the wire using the unmodified PoC script.
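
If you want to grep logs or captured payloads for that marker yourself, a minimal sketch looks like this.  The boundary string comes from the public PoC; the scanning logic is purely illustrative and not tied to any particular IDS.

```python
# Minimal sketch: flag payloads or log lines containing the default
# PoC content boundary. Only the marker string comes from the public
# PoC; everything else here is illustrative.
POC_BOUNDARY = b"zXJpHSq4mNy35tHe"

def contains_poc_marker(payload: bytes) -> bool:
    """Return True if a payload carries the default PoC boundary."""
    return POC_BOUNDARY in payload

def scan_lines(lines):
    """Yield (line_number, line) for each line carrying the marker."""
    for n, line in enumerate(lines, start=1):
        if contains_poc_marker(line):
            yield n, line
```

In practice you'd express this as a content match in your IDS ruleset.  Keep in mind an attacker can trivially change the boundary, so this only catches unmodified PoC use.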

Most of this content was cross posted from my Peerlyst page.

Tuesday, December 27, 2016

New Joomla vulnerability - TL;DR you're probably okay

There's a new vulnerability in the core Joomla distribution, this time in the bundled PHPMailer library.  Successful exploitation results in remote code execution (RCE), and normally I'd be shouting "patch now" from the rooftops.  But in this case, you're probably okay.

The vulnerability is in the "From" email parameter.  The core distribution only uses an API that does not allow the "From" email to be modified.  Joomla advises that some other plugins may use the PHPMailer library in ways that allow the "From" address to be modified, which might result in RCE.  However, they stop short of naming any vulnerable plugins.  Do you know of any vulnerable plugins?  Hit me up in the comments.
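
To make the attack class concrete: the published PoC abuses a "From" value that smuggles extra sendmail command-line arguments.  Below is a hypothetical defensive sketch, not PHPMailer's actual patch, that rejects sender addresses outside a plain address shape.  The regex and function name are my own illustration.

```python
import re

# Hypothetical defensive sketch: reject sender addresses that could
# smuggle extra sendmail arguments (quotes, spaces, backslashes).
# This is NOT PHPMailer's actual fix -- it only illustrates the class
# of input the vulnerability abuses.
SAFE_FROM = re.compile(r"^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}$")

def is_safe_from_address(addr: str) -> bool:
    """True only for plain addresses with no shell-significant characters."""
    return bool(SAFE_FROM.match(addr))
```

An injection-style value like `"attacker\" -oQ/tmp/ -X... "@example.com` fails this check immediately, while ordinary addresses pass.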

Friday, December 23, 2016

Rejects v1

As many of you know, I regularly contribute to SANS NewsBites.  It's an outstanding email newsletter that normally is published twice weekly.  Not everything I contribute gets published though.  Sometimes things get chopped by the editors.  I have a pretty good idea of what doesn't follow SANS' editorial guidelines and try not to contribute those thoughts.  For the rest of it though, I decided I'm letting a lot of good content I've already written go to waste and decided I'd start publishing them here under the heading "rejects."  This blog series is not affiliated with SANS in any way and does not reflect their views.  Also, I am not in any way knocking NewsBites for not publishing everything I send in.  It's a tremendously valuable newsletter - one that I used myself throughout the years and I'm honored to be a contributor now.

Regarding a story about how the number of claims against cyber insurance are on the rise:
In my practice, I work with a number of organizations that have great confusion about what is and isn't covered by their cyber insurance policies.  Don't assume anything here; the stakes are far too high.  I always recommend organizations perform tabletop exercises to determine whether their coverage would be sufficient for events reported in the media, then adjust their risk models (and perhaps coverage) to suit.
Regarding a story about how the US military "was almost brought to its knees" by Russian hackers:
The media has blown this out of proportion, saying that it could "bring the US military to its knees."  Those who understand the intel gain/loss model know that no such action is likely.  Russia could use this access to continue to gather information indefinitely until detected or perform a very temporary disruptive event.  Attackers most often have far more capability than they exercise during an intrusion.
That's all I have for this week.  Hopefully this adds value in some way.

Thursday, December 22, 2016

South Carolina wants porn filters installed on new computers

I so wish this was a joke.  Unfortunately, it's serious.  Take a minute and read the article.

This is a great example of legislators wanting to do something positive, but doing something very negative instead.  The desire here is to limit human trafficking (a noble goal) but the method is through porn filters on computers sold in the state.  There are so many things wrong with this, I don't even know where to begin.  Obviously there's no causal relationship to speak of.  Porn doesn't cause human trafficking or vice versa. So there's that important tidbit.

Then there's the Constitutional aspects of the proposed legislation.  It's unlikely this law would ever survive a Constitutional challenge, and if that's the case then passing it just takes resources away from the state (resources used in a likely futile attempt to uphold the legislation in court could better be spent elsewhere).

But my real concern is that the impact to computer security would likely be significantly negative.  To be effective, the porn filters would have to integrate with browsers and would be unlikely to meet the same security standards as other software.  Then there's the issue of telemetry and big brother, securely updating block lists, etc. Further, the proposal allows end users to pay to remove the porn filter. I can already see the underground "free porn filter remover" economy popping up, similar to illicit keygen programs, most laced with malware.

I have no love of porn (I see way too much of it in forensics cases) and don't live in the state of SC.  But I bring these thoughts forward because far too often we experience a disconnect between intent and reality in infosec, particularly when legislators get involved.  Take time over the holidays to educate your family members that many pleas of "save the children" or "stop human trafficking" have negative implications for infosec.

Tuesday, December 20, 2016

Encryption of healthcare SAN/NAS

I ran this poll a couple of weeks ago on Twitter.  I was looking to back up a theory of mine with some data, however bad my sample set is (people who follow me on Twitter).  In the end, I got some data, but I'm not sure how valid it is.  


The problem with this poll is that even though it got 53 replies (which I'm super thankful for), I don't know how many of these respondents really work in healthcare.  People also have a tendency to tell you what they think you want to hear.  I think that's going on here too.  People know that HIPAA requires encryption for data in transit and portable devices.  I think they are extending that to the SAN/NAS example here.

I can't imagine many likely scenarios where you would invest money in a SAN/NAS (where performance is key) and then lose performance (money) on disk encryption.  Full disk encryption protects primarily against physical attacks and your SAN/NAS should be in a secure environment.

This was cross posted from my Peerlyst account.  I'm really interested in people's perspectives on this, but I've had to largely disable comments on the blog due to spam.  If you have something to contribute, hop on over to Peerlyst and comment there.

Saturday, December 17, 2016

Infosec reporting and the problem of reaching your audience

If you've ever taken a course with me at SANS, you know how big I am on reporting and getting it right.  You can be the best in the world at the technical aspects of infosec, but none of that matters if you can't write.  I regularly tell people to shoot for at most a 7th-grade reading level in their executive summaries.  Your executives aren't stupid (most of them), but you shouldn't make your writing hard to read or you're less likely to get engagement.  This great article I found doesn't cover writing for infosec explicitly, but it really hammers home how many people read below a 9th-grade level.

Read and heed. The people in infosec making the most cash aren't always the smartest or the most technical. They're the ones that can communicate effectively - and that invariably involves writing coherently.
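
If you want a quick sanity check on your own executive summaries, you can approximate the Flesch-Kincaid grade level in a few lines.  This is a rough sketch: the syllable counter is a crude vowel-run heuristic, and dedicated readability tools do considerably better.

```python
import re

def count_syllables(word: str) -> int:
    """Very rough syllable count: runs of vowels, minimum one."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text: str) -> float:
    """Approximate Flesch-Kincaid grade level of a passage."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    n = max(1, len(words))
    return 0.39 * (n / sentences) + 11.8 * (syllables / n) - 15.59
```

Run your executive summary through something like this before it goes out the door.  If it scores well above 9, most of your audience will struggle with it.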

This was cross posted from my #Peerlyst account.  If you haven't yet joined the Peerlyst community, I think it's a great source of knowledge for the community. Go sign up.

Tuesday, December 13, 2016

Bad correlations in IR? Maybe no reverse engineers is the problem?

Correlation isn't the same thing as causation.  Forensics professionals often seem to forget that when they deal with incident data.  Just because an event occurred and malware was found on a machine that could have caused the event doesn't mean the malware caused the event.  Is there a correlation?  Sure.  Is this enough to establish causation?  Nope.

I semi-regularly tweet images of spurious correlations to remind my DFIR brethren that correlation is not the same as causation.  These are so ridiculous that they paint a powerful picture of why the two can never be treated as the same thing.
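
A toy example makes the point concrete: any two series that merely trend the same direction correlate strongly, with zero causal link between them.  The data below is invented purely for illustration.

```python
# Two deterministic, causally unrelated series that both trend upward --
# say, monthly widget sales and monthly support tickets. The numbers are
# made up purely for illustration.
def pearson(xs, ys):
    """Plain Pearson correlation coefficient, no external libraries."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

sales   = [1, 2, 3, 4, 5, 6]
tickets = [10, 11, 14, 15, 18, 20]
# pearson(sales, tickets) comes out above 0.95 -- "near perfect"
# correlation between two series that have nothing to do with each other.
```

The same arithmetic underlies the "malware was present, therefore malware did it" leap: the numbers line up, and that's all they do.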

But why do we ever assert that correlation and causation are the same?  I think the root of this is a lack of knowledge.  This in turn leads to logical fallacies in our thinking.  We can fix this correlation/causation confusion by learning more about incident response.  How do we get more data? What data is actually useful in investigating the incident?  If I wanted to find out more, where should I look?  What does normal look like in my environment?

One of my biggest suggestions for overcoming these issues in IR is to make sure you have access to a reverse engineer.  Using black magic and ritual sacrifices*, reverse engineers can help alleviate confusion about what capabilities a piece of malware actually has.  I frequently read in reports that "the malware is using code injection."  Why, I ask?  "Because we checked for normal persistence and didn't find anything."  This is obviously not a strong connection.  In fact, it's REALLY weak.  Absence of one thing is not proof of another.  Period.
* Often, reverse engineering involves neither of these things.

Reverse engineers can also benefit organizations by helping to fully decode malware C2.  I can't tell you the number of reportable HIPAA incidents I've seen staved off by knowing specifically what an attacker did (and exfiltrated) from a network.  Full packet capture is great, but it can't be fully used without good C2 analysis (which requires a reverse engineer).

I'll step off my soapbox about reverse engineers and say that having one available is a game changer for many organizations.  Reverse engineering answers questions that no other tool or capability can.  If your team is growing and doesn't yet have enough work for a full-time reverse engineer, you can always put one on retainer from a reputable firm.  If you need help, talk to Rendition Infosec; we have reverse engineers that would be happy to maximize your return on investment and change "we think this is what happened" to "we know this is what happened."

Monday, December 12, 2016

Disqualifying votes in Michigan - how would this play in PA?

I'm a little late on this, but figured I'd discuss the issue anyway.  I read a story that says that many of Clinton's votes in Michigan may be disqualified from the recount due to problems with the voting machines in precincts that heavily favor her.  The issue has to do with reconciling vote tallies with voter sign in logs.  The discrepancies in reconciliation have to do with old voting machines that may be faulty.

Interestingly, we only know of this because of the paper record that is generated in Michigan.  But Pennsylvania uses pure electronic voting.  How would this play there?  As I ponder the idea of auditing an e-voting machine, what would happen if malware were found on the machine?  Would you have to disqualify all of the votes?  Since most machines are air-gapped, what if malware was found on the machine that programs or reads the PCMCIA cards for the e-voting machines?  Do you disqualify all of the votes for the machines the infected computer came in contact with?

Yes, malware could technically change the paper backup used in many states too (as Cylance showed), but I'm more concerned about the Pennsylvania case since that's potentially going to be an issue sooner than later.

If finding malware on a machine invalidates votes, then the smartest way to hack an election is perhaps to compromise machines in the precincts where your opponent is heavily favored then trigger an audit.  I'm not recommending this, just suggesting it's the logical conclusion to a strategy of removing votes for malware. 

I don't have all the answers and I'm not trying to start trouble.  But I would urge you to contact your state legislator and ask them how your state will handle issues of malware found on voting machines or those used to tally votes.  If they don't have an answer, suggest that they sponsor legislation.  Practically any legislation on the matter is better than the court battles that will inevitably occur in a legislative vacuum.

Update: a US District Judge just ruled that a recount cannot be held in PA, saying it cannot be completed before votes must be certified. Judge also says it "borders on the irrational" to suspect hacking occurred in Pennsylvania.

Saturday, December 10, 2016

I'm a failure - (mis)adventures in CFP submissions

I love speaking at security conferences.  A good conference presentation goes beyond just sharing your data.  It's a true performance art. Edutainment if you will.  I've been a technical reviewer for submissions at a number of conferences as well.  I always submit to a CFP as though I were a reviewer thinking "is this a presentation I would like to see myself?"  If the answer is no, I don't submit it.

That being said, I'm always a little put out when I get rejected for a conference.  A bunch of reviewers looked at my work. My idea. My baby. And having judged it, they found it lacking. No matter how many times I've been through it or how I just know it will be different this time, I'm always put out.  Sometimes I have a feeling of impostor syndrome.  I always find myself wondering "why didn't they like me" or "why wasn't I good enough?"  Sometimes I think that the reviewers know a bunch of people have presented on this topic before me - they think I'm a fraud.... Thoughts (self destructive thoughts) like these happen every. single. time.

But then I quickly remember something I saw on an old "No Fear" tee shirt years ago:
100% of people who don't run the race never win
This is when I remember that I have to put myself out there to win.  I personally submit several proposals for every one that is accepted.  Sometimes when I get rejected from one conference, I submit the same paper to another conference with no edits and it gets accepted. Sometimes reviewers are helpful (like at DEFCON, thanks Nikita) with providing great feedback and I am able to modify my submissions to be better for the next conference.

When you submit to a CFP and aren't accepted, I think it's important to let others know that you've submitted, but were ultimately rejected.  I think this does two important things:

  1. It lets others know you are at least trying to give back to the community.
  2. It lets others know they are not alone in being rejected.

Most conferences receive twice the number of submissions they can accommodate, some receive even more than this.  They have to reject someone. In fact they have to reject a lot of someones.  But don't let this discourage you.  Keep submitting, keep polishing the submission, and most of all don't fear failure.  It's totally natural to feel bad with a rejection notice, but you have to brush yourself off and get back up again.

Why am I writing this now? I submitted two papers to Shmoocon this year, both before the early decision cutoff.  When the early decision came and went without me on the list, I felt bad about myself.  Then I got out of my slump and figured maybe I'd be accepted in the second round. Yesterday, I got two notifications.  The first said I was accepted to speak.  I was elated.  The second email ten minutes later said I was rejected.  I was totally deflated and wondered "what's wrong with me?"  Truth be told, I never expected to be picked up for both talks.  I'm honestly happy I got accepted for one talk at all.  This will be my third time speaking at Shmoocon and it's an awesome conference.  But they didn't like my second talk.  They don't like my ideas. I'm a failure. A drink or two later I was celebrating being accepted for one talk and the pain from the rejection felt long gone.  If I'd been rejected for both, the sting would likely still be stronger.

Update: Someone reached out to me and said I should be happy one paper got accepted. I am. He said I should be grateful that I wasn't, as he put it, "totally rejected."  Again, I am.  For the record, I submitted three to RSA this year with a 100% rejection rate.  My point in explaining this was to note that you can feel rejection even when you've been accepted.  Again, if this helps you - great.  If it doesn't, then just forget I wrote it.

These destructive thought patterns are far too easy for us all to fall into.  I'm not writing this for your sympathy, I'm hoping that others can read this and realize "I am not alone - this is something that others go through."  If that's not you, I envy you and the control you exert over your emotions.  For the rest of you: you are normal, these thoughts are normal, don't give up, don't stop submitting, give back to your community.

I'll close by saying that I think security conferences are very important and so is speaking at them. My company Rendition Infosec sponsored several conferences this year and will continue to in 2017.  I also strongly encourage my employees (okay, it's technically coercion) to submit and speak at conferences.  Three members of the Rendition team (Edward McCabe, Brandon McCrillis, and Michael Banks) spoke at multiple infosec conferences this year.  I try to coach them through the submission process to maximize their acceptance rates, but I suspect I'm putting them in a bad emotional state when they are rejected. For that, let me formally apologize.

Tuesday, December 6, 2016

New Linux privilege escalation vulnerability

There's a new Linux privilege escalation vulnerability (CVE-2016-8655) that will allow normal users to elevate to root. The bug is in the networking subsystem and relies on the attacker being able to create a raw socket with CAP_NET_RAW. In most Linux distributions, users can't do this unless unprivileged namespaces are enabled.
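
One quick way to gauge your exposure is to read the sysctls that gate unprivileged user namespaces.  A hedged sketch follows: the first path is Debian/Ubuntu-specific, the second appears on newer kernels (including RHEL 7.4+), and a missing file simply means "unknown" on your distro.

```python
from pathlib import Path

# Sketch: read the sysctls that gate unprivileged user namespaces.
# Paths differ by distro (the first is Debian/Ubuntu-specific, the
# second appears on newer kernels), so treat a missing file as unknown.
CANDIDATE_SYSCTLS = [
    "/proc/sys/kernel/unprivileged_userns_clone",
    "/proc/sys/user/max_user_namespaces",
]

def read_sysctl(path: str):
    """Return the integer value of a sysctl file, or None if absent."""
    try:
        return int(Path(path).read_text().strip())
    except (FileNotFoundError, ValueError):
        return None

def report():
    for p in CANDIDATE_SYSCTLS:
        print(p, "=", read_sysctl(p))
```

A nonzero value for either knob means unprivileged users can enter a namespace where they hold CAP_NET_RAW, which is the precondition this bug needs.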

Red Hat notes that RHEL 5 and RHEL 6 are not impacted by the bug. RHEL 7 is, but not in its default configuration, since unprivileged namespaces are not enabled.

Multiple versions of Debian are listed as vulnerable.

There are also many Ubuntu builds that are vulnerable.

The researcher who found the bug (Philip Pettersson) notes that he discovered the bug by examining areas where memory is allocated in unprivileged namespaces.  Since these are a relatively new development in Linux, it might be that there are locations where developers didn't account for untrusted users having access to manipulate certain kernel structures.  Other such issues may exist in other areas of the code.

At Rendition Infosec we always recommend that clients minimize their exposure by applying the latest operating system and software patches.  This bug also demonstrates another principle that we try to drive home with our clients: minimize your attack surface.  If you don't need it, don't enable it.  Minimizing attack surface is what keeps RHEL 7 from being vulnerable in a default configuration.