Thursday, May 19, 2016

ICMC16: LibreSSL

Giovanni Bechis, Owner, System Administrator and Developer, SnB, Developer, OpenBSD

The LibreSSL project started in April 2014, after Heartbleed was discovered in OpenSSL. Many vendors and systems were impacted by it, and there is no way to tell whether you were attacked or not. Why did this happen? The OpenSSL code was too complex. The question: should we try to fix OpenSSL, or fork?

They decided to fork because the OpenSSL code base was too complex and intricate to fix in place. This has changed more recently, but in April 2014 the OpenSSL developers were only interested in new features, not in bug fixing. Heartbleed wasn't the only reason they decided to fork; the underlying problem was that the code was too complex. For example, OpenSSL doesn't use the system malloc, and the allocator it does use doesn't free memory: it recycles buffers LIFO. The debugging features in its allocator are useful for debugging, but could be used in an attack.
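To see why a LIFO recycling allocator that never scrubs buffers is dangerous, here is a minimal sketch (illustrative only, not OpenSSL's actual allocator): a freed buffer's old contents can reappear verbatim in the next allocation, so a parsing bug that over-reads a recycled buffer can leak secrets.

```python
# Toy LIFO freelist allocator that recycles buffers without clearing
# them, in the spirit of the behavior described above. Not real
# OpenSSL code; purely an illustration of the risk.

class FreelistAllocator:
    def __init__(self):
        self._free = []  # LIFO stack of recycled buffers

    def alloc(self, size):
        # Reuse the most recently freed buffer if one is available.
        if self._free:
            return self._free.pop()
        return bytearray(size)

    def free(self, buf):
        # Note: the buffer is NOT zeroed before being recycled.
        self._free.append(buf)

allocator = FreelistAllocator()
secret = allocator.alloc(16)
secret[:7] = b"hunter2"        # sensitive data written into the buffer
allocator.free(secret)         # freed, but contents remain

reused = allocator.alloc(16)   # the next allocation gets the same buffer
assert bytes(reused[:7]) == b"hunter2"   # stale secret is still visible
```

A system malloc with debugging aids (or OpenBSD's hardened malloc) would have a chance of catching or scrubbing this; the private freelist hides it.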

At the time, pretty much all OpenSSL API headers were public, so many application developers were using interfaces they should never have been exposed to. OpenSSL also uses its own functions instead of those provided by libc, etc.

There is a lot of #ifdef preprocessor code to work around bugs in compilers or on specific systems.

Forked from OpenSSL 1.0.1g. Have been backporting bug fixes from that tree.

OpenSSL is the "de facto" standard and widely used. It is difficult to get patches applied upstream. They wanted to preserve the API to maintain compatibility with OpenSSL.

They want to make sure they use good coding practices and fix bugs as fast as possible.

No FIPS support, mainly because their developers are not in the US. They have removed some old ciphers (MD2, etc.) and added ChaCha20 and Poly1305.

Removed SSLv3 support. Removed dynamic engine support, mostly because there were no engines for OpenBSD so they could not test.

OpenSSL is portable, but at the expense of needing to reimplement things that are found in most implementations of libc and lots of #ifdef and #ifndef.

Some of the OpenBSD software has been switched to use libressl, like the FTP client software. 

ICMC16: The OpenSSL 1.1 Audit

Kenneth White

There is an Open Crypto Audit Project, originally formed to do an audit of TrueCrypt and currently seeking non-profit status. More recently it has been looking at OpenSSL. Why? It's everywhere!

OpenSSL 1.0.2 FIPS is in over 100 validations.

Enterprise people often say they don't care about FOSS, not realizing it's deployed very widely in their enterprise, like the Cisco VPN client!

The audit of OpenSSL was commissioned by the Linux Foundation. A pretty ambitious scope.

Most of the code in OpenSSL is written in C (70%), and it currently has about 8 million lines of code. (That's a lot to audit!)

First look at BigNum, BIO (focusing on composition and file functions), ASN.1 and X.509 with a 93M certificate corpus, and "frankencert" fuzzing.

The next phase will cover the TLS state machine, EVP, protocol flows and core engine implementation, memory management, and the crypto core (RSA, SHA-2, DH/ECDH, CBC, GCM).

They need to focus on the most relevant platforms, algorithms, and protocols.

Preliminary findings:
Complexity led to some potential bugs being invalidated by pre- or post-target parsing. PEM parsing accepted unexpected formats, including access to ASN.1 decoding facilities and the HMAC and CMAC algorithms. A memory leak and an integer overflow were identified, but these appear to be invalid or low-severity issues. RSA uses blinding and constant-time operations by default.
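RSA blinding, the countermeasure the auditors note is on by default, randomizes each private-key operation so that timing measurements don't correlate with the secret exponent. A toy sketch with tiny textbook-RSA parameters (insecure, purely to show the algebra):

```python
# RSA blinding demo with a tiny textbook key (n = 61 * 53). Real RSA
# uses padding and large moduli; this only illustrates the blinding
# step: (c * r^e)^d * r^-1 = m (mod n), since r^(ed) = r (mod n).
import math
import random

n, e, d = 3233, 17, 2753   # toy key, demo only

def decrypt_blinded(ciphertext):
    # Choose a random blinding factor r coprime to n.
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    blinded = (ciphertext * pow(r, e, n)) % n   # c * r^e mod n
    m_blind = pow(blinded, d, n)                # = m * r mod n
    return (m_blind * pow(r, -1, n)) % n        # unblind with r^-1

m = 65
c = pow(m, e, n)                 # textbook-RSA "encrypt"
assert decrypt_blinded(c) == m   # blinding is invisible to the caller
```

Because r is fresh and random each time, the value actually exponentiated with d is unpredictable to an attacker timing the operation.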

From the fuzzing work, they found 280 certificates with very bizarre dependencies that caused diverse code paths to be taken. The fuzz testing of X.509 parsing did not result in any crashes. They did find some bugs with DER fuzzing, related to performance, but the right things seemed to happen.

Still looking at low-impact, low-likelihood, low-severity potential vulnerabilities, but overall the code is looking very solid.

As Poly1305 and ChaCha20 were added recently, they'd like to take another look.

ICMC16: OpenSSL Update

Tim Hudson, Cryptsoft

The OpenSSL team had their first face-to-face meeting, ever! 11 of the 15 members got together, digging into documentation and fixing bugs, and then POODLE broke, so... they got to know each other quite well.

The team thinks of time as "before" April 2014 and "after". Before, there were only 2 main developers, working entirely on a volunteer basis, with no formal decision-making process and extremely limited resources. After April 2014, there are now 2 full-time developers and 5-6 regular developers. This really helps the project.

After a wake-up call, you have to be more focused on what you should be doing. Everyone is now analyzing your code base, looking for the next Heartbleed. There is now more focus on fuzz testing and increased automated testing. Static code analysis tools are rapidly being updated to detect Heartbleed and bugs like it.

New: mandatory code review for every single changeset.  [wow!]

The OpenSSL team now has a roadmap, and they are reporting updates against it. They want to make sure they continue to be "cryptography for the real world" and not ignore the "annoying" parts of their user base that have legitimate concerns or feature needs.

Version 1.0.2 will be supported until 2019.  No longer will all releases be supported for essentially eternity.

OpenSSL is now changing defaults for applications, using larger key sizes, and removing platform code for platforms that are long gone from the field. They are adding new algorithms, like ChaCha20 and Poly1305. The entire TLS state machine has been rewritten. A big change: internal data structures are going to be opaque, which will let maintainers make fixes more easily.

FIPS 140-related work paid for OpenSSL development through 2014. It is hard to go through a validation with one specific vendor, who will have their own agenda.

There are 244 other modules that reference OpenSSL in their validations, and another 50 that leverage it but do not list it on their boundary.

The OpenSSL FIPS 2.0 module works with OpenSSL-1.0.x. There is no funding or plans for an OpenSSL FIPS module to work with OpenSSL-1.1.x.

The hopes are that the FIPS 3.0 module will be less intrusive.

If you look at all the OpenSSL related validations, you'll see they cover 174 different operating environments.

How can you help? Download the pre-release versions and build your applications. Join the openssl-dev and/or openssl-users mailing lists. Report bugs and submit features.

If FIPS is essential to have in the code base, why hasn't anyone stepped forward with funding?


Wednesday, May 18, 2016

ICMC16: Secure Access with Open Source Authentication

Donald Malloy, LSExperts

For years, Mastercard and Visa said that implementing chips in cards would cost more than the fraud, but that has all changed in recent years. EMV (chip and PIN) is being rolled out in the US, which will push US fraud online (where the chip can't be used).

Fingerprints are non-revocable. Someone can get them from a picture or by hacking into a database.

OATH is an industry consortium; its algorithms are free to use.

They have started a certification program, so we can verify that vendor tokens work together.

Why is OTP still expensive? It comes in soft tokens, hard tokens, USB tokens, etc. The cost per user has consistently been too high; manufacturers continue to have a business model that overcharges users. OATH is giving away the protocols, so why is it still so expensive?
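The OATH algorithms really are free and simple to implement. Here is HOTP (RFC 4226), the OATH event-based OTP algorithm, in stdlib Python, checked against the test vectors from the RFC's appendix; OATH's time-based TOTP (RFC 6238) just derives the counter from the clock.

```python
# HOTP per RFC 4226: HMAC-SHA1 over the big-endian counter, dynamic
# truncation to 31 bits, then reduce to the desired number of digits.
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                        # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Test vectors from RFC 4226, Appendix D (ASCII secret "12345678901234567890"):
secret = b"12345678901234567890"
assert hotp(secret, 0) == "755224"
assert hotp(secret, 1) == "287082"
assert hotp(secret, 2) == "359152"
```

Given how little code this is, the per-user cost of OTP hardware is a business-model choice, not a technical one, which is the speaker's point.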

They are working with LinOTP: fast and free.

Since biometrics are irrevocable, how do we get stronger passwords? Could we use behavioral analytics? Type a phrase and the computer will know it's you.

ICMC16: The Current Status and Entropy Estimation Methodology in Korean CMVP

Yongjin Yeom, Kookmin University; Seog Chung Seo, National Security Research Institute

NSR is the only organization doing this testing in Korea.

There are vendors, a testing authority, and a certification authority. NSR tests according to Korean standards, and it develops and standardizes testing methodology with vendors.

KCMVP started in 2005, using their own standard. Starting in 2007, it leveraged international standards.

116 modules have been validated. There are more software validations than hardware validations. Most of the software validations have been in C and Java.

The KCMVP process overview looks very similar to CAVP combined with CMVP in the US/Canadian schemes: algorithm verification, tests of the self-tests, etc. ;)

They have a KCMVP tool to automatically verify algorithm correctness.

Just like here, there are big concerns about entropy sources. It's hard to get entropy to scale, so they want to discover the limits of each entropy source.
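One common building block in entropy estimation (for instance, the most-common-value estimator in NIST SP 800-90B, which Korean methodology work is broadly comparable to) is the min-entropy of the observed sample distribution. A simplified sketch, omitting the confidence-interval adjustment the real estimator applies:

```python
# Simplified per-sample min-entropy estimate: H_min = -log2(p_max),
# where p_max is the frequency of the most common symbol. Real
# estimators (e.g., SP 800-90B's MCV) add a confidence bound.
import math
from collections import Counter

def min_entropy_per_sample(samples):
    counts = Counter(samples)
    p_max = max(counts.values()) / len(samples)
    return -math.log2(p_max)

# A balanced 2-symbol source: p_max = 0.5, so 1 bit per sample.
assert min_entropy_per_sample(b"ABABABAB" * 100) == 1.0

# A heavily biased source yields far less min-entropy.
biased = b"A" * 900 + b"B" * 100      # p_max = 0.9
assert min_entropy_per_sample(biased) < 0.2
```

Min-entropy is the conservative measure: it reflects the attacker's best single guess, which is why validation schemes prefer it over Shannon entropy.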

ICMC16: Introduction on the Commercial Cryptography Scheme in China

Di Li, Senior Consultant, atsec information security corporation

Please note: Di Li works for atsec, and is not speaking for the Chinese Government.

Their focus is on certifying both hardware and software, including HSMs and smart cards. In China, only certified products can be sold or used, and they must be sold commercially. By law, no foreign encryption products can be sold or used in China.

Additionally, they use their own algorithms. Some are classified; others leverage international algorithms like ECC; others would be considered competitors to algorithms like SHA-2 or AES-GCM.

They have their own security requirements for cryptographic modules, but there are no IGs, DTRs, etc., so it is different. There is no concept of "hybrid", either.

There are two roles in the scheme: OSCCA and the vendor. OSCCA issues the certificates and is also the testing lab. The vendor designs and develops the product, then sells and promotes it. Vendors report their sales figures to OSCCA every year.

There are requirements to be a vendor as well! To get sales permission, you need to demonstrate you are knowledgeable in cryptography, among other things.

Validations are updated every 5 years. As part of this process, you additionally have to pass a design review.

You cannot implement cryptography within a common (general-purpose) chip, as there is not enough security in such a chip.

Banking must use certified products. The biggest market is USB tokens and smart cards.

ICMC16: Automated Run-time Validation for Cryptographic Modules

Apostol Vassilev, Technical Director, Research Lead–STVM, Computer Security Division, NIST
Barry Fussel, Senior Software Engineer, Cisco Systems

Take a look at Verizon's breach report. It's been going for 9 years, and we see things are not getting better. It takes attackers only minutes to subvert your security, and they are financially motivated. Most of the industry doesn't even have a mature patching process.

We hope validation can help here, but vendors complain it takes too long, the testing depth is too shallow, and it is too costly and rigid.

In the past, we leveraged code reviews to make sure code was correctly implemented. Data shows, though, that code reviews only find about 50% of bugs. Better than nothing, but not complete. Can we do better?

Speeding up the validation process allows people to validate more products.

Patching is important, but the dilemma for the industry is "do I fix my CVE or maintain my validation?" This is not a choice we want our vendors to make. There should be some way for people to patch and still maintain their validation.

The conformance program needs to be objective, yet we have different labs, companies, and validators relying on these written test reports. This is a very complex essay! Reading the results becomes dizzying. We want to improve our consistency and objectivity; how can we do this? So we asked the industry for feedback on what the landscape looks like and how we could improve the program, and we started the CMVP working group.

The problem is not just technical, but also economic. To make changes, you have to address all of the problems.

Additionally, we need to reconcile our (NIST's) needs with NIAP's (for common criteria). 

And a new problem - how to validate in the cloud? :)

The largest membership is in the Software Module subgroup. Whether that is due to the current FIPS 140-2 handling software so poorly, or reflective of a changing marketplace, is unclear.

Fussell gave us a deep dive into the automated run-time validation protocol.

The tool started out as an internal Cisco project under the oversight of David McGrew, but didn't get a lot of traction. It found a home in the CMVP working group.

One of the primary requirements of this protocol is that it be an open standard.

Case example: Cisco has a crypto library that is used in many different products. First they verify that the library is implemented correctly, but then they need a way to verify that products integrating it do so correctly.

The protocol is a lightweight standards-based protocol (HTTPS/TLS/JSON). No need to reinvent the wheel.
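To make the HTTPS/TLS/JSON idea concrete, here is a hypothetical sketch of what a JSON test-vector exchange might look like. All field names below are invented for illustration; the talk did not specify the wire format.

```python
# Hypothetical test-vector request in the spirit of the lightweight
# JSON protocol described in the talk. The validation server would
# POST vectors like this to the module under test; the module runs
# them and returns its answers for comparison.
import json

request = {
    "algorithm": "AES-GCM",          # illustrative field names only
    "direction": "encrypt",
    "keyLen": 256,
    "testCases": [
        {"tcId": 1, "key": "00" * 32, "iv": "00" * 12, "pt": "00" * 16},
    ],
}

wire = json.dumps(request)           # what would travel over HTTPS/TLS
echoed = json.loads(wire)
assert echoed["algorithm"] == "AES-GCM"
assert echoed["testCases"][0]["tcId"] == 1
```

The appeal of this shape is exactly what the speaker says: HTTPS, TLS, and JSON are already ubiquitous, so both labs and vendors can build clients and servers with off-the-shelf tooling.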

Authentication will be configurable: multi-factor authentication should be an option, but some deployments won't need a heavy authentication model.

And Barry had a video! (I hope to have it linked soon.)

If things are going to be as cool as the video promises, you'll be able to get a certificate quickly, from a computer. :)  And if you find a new