Thursday, May 10, 2018

ICMC18: Panel Discussion: Technology Challenges in CM Validation

Panel Discussion: Technology Challenges in CM Validation (G21b)
Moderator: Nithya Rachamadugu, Director, CygnaCom, United States
Panelists: Tomas Mraz, Senior SW Engineer, Red Hat, Czech Republic; Steven Schmalz, Principal Systems Engineer, RSA, the Security Division of EMC, United States; Fangyu Zheng, Institute of Information Engineering, CAS, China

All three panelists have been through their share of validations, and Fangyu has also had to deal with China's equivalent of the CMVP process.

As to their biggest challenges, everyone agrees that time is the issue. Tomas noted that it's very difficult to get the open source community excited about this or to do the work needed to support validations. Fangyu's team often has to maintain two versions of several algorithms: one for US validations and one for Chinese validations.

In general, it's hard to find out what requirements most customers actually have, particularly across different geographies.

Several panelists agree that validation is seen as an expensive checklist item. Steven also worries about the impact on the business: the cost goes beyond what you pay the labs and the engineers who write the code. It's hard to get the validation done and get it out to all the customers. Tomas noted that there are conflicting requirements between FIPS and other standards (AES-GCM is one example, though that has recently been addressed).
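The AES-GCM friction Tomas alludes to has typically centered on who generates the 96-bit IV, the module itself or the calling protocol. The snippet below is only a minimal illustration of a module-side random IV, not anything shown by the panel; it assumes the third-party Python cryptography package, and the key handling and data are placeholders.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Illustrative only: FIPS guidance has preferred the IV to be generated
# inside the module boundary, while some protocols construct it themselves.
key = AESGCM.generate_key(bit_length=256)   # placeholder key management
aesgcm = AESGCM(key)

iv = os.urandom(12)                         # 96-bit IV generated "inside" the module
ciphertext = aesgcm.encrypt(iv, b"example plaintext", b"example AAD")

assert aesgcm.decrypt(iv, ciphertext, b"example AAD") == b"example plaintext"
```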

On the value of the certification: have you found anything during a validation that made your product more secure? Steven notes that you can talk about methodologies for preventing software vulnerabilities, and the developers will come back and ask why they didn't work for "so-and-so." But algorithm testing specifically does give you value: it confirms that you've implemented the cryptography correctly. It's less clear that the module validation provides as much benefit. There was agreement across the panel that CAVP testing is technically valuable.
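As a concrete illustration of why algorithm testing catches implementation bugs, here is a minimal known-answer test in Python against the AES-128 example vector from FIPS 197, Appendix C.1. It is only a sketch of the idea behind CAVP-style testing, not the actual CAVP harness, and it assumes the third-party cryptography package.

```python
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

# AES-128 known-answer vector from FIPS 197, Appendix C.1.
key       = bytes.fromhex("000102030405060708090a0b0c0d0e0f")
plaintext = bytes.fromhex("00112233445566778899aabbccddeeff")
expected  = bytes.fromhex("69c4e0d86a7b0430d8cdb78070b4c55a")

encryptor = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
ciphertext = encryptor.update(plaintext) + encryptor.finalize()

# A mismatch here means the implementation does not compute AES correctly.
assert ciphertext == expected, "AES known-answer test failed"
print("AES-128 known-answer test passed")
```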

Steven really wishes there were a lot more guidance on the timing of validations and on how to handle vulnerabilities. Tomas notes that it's hard to limit changes to the boundary in the kernel, because they need to add new hardware support and other things. Fangyu noted that even rolling out fixes for a hardware module is challenging.

All panelists are excited about the automation that is happening, though Steven wonders whether it will really be possible for module testing (algorithm testing seems very automatable, and that alone will still help). Steven also talked about the industry trend toward continuously checking the status of machines and making sure they are up to date with patches; getting this kind of automation in place can help people continuously check their work, even on modules still in development.
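As a sketch of the kind of continuous checking Steven describes, the script below runs a couple of standard-library known-answer checks (SHA-256 of the empty string and an HMAC-SHA-256 vector from RFC 4231) and exits non-zero on failure, so it could be dropped into a CI job or a monitoring probe. The structure and choice of vectors are mine, not anything described by the panel.

```python
import hashlib
import hmac
import sys

def run_self_tests() -> bool:
    """Run a few known-answer tests; return True only if all pass."""
    ok = True

    # SHA-256 of the empty string (well-known reference value).
    sha256_empty = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
    ok &= hashlib.sha256(b"").hexdigest() == sha256_empty

    # HMAC-SHA-256, RFC 4231 test case 1.
    key = bytes.fromhex("0b" * 20)
    expected = "b0344c61d8db38535ca8afceaf0bf12b881dc200c9833da726e9376c2e32cff7"
    ok &= hmac.new(key, b"Hi There", hashlib.sha256).hexdigest() == expected

    return ok

if __name__ == "__main__":
    # Non-zero exit makes a failure visible to CI or monitoring.
    sys.exit(0 if run_self_tests() else 1)
```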

All panelists noted that the validation process could be improved, but that improving the process alone won't improve the overall security of the system.

Customers want Common Criteria and FIPS 140-2, but they don't really understand what the certifications mean; they just want to make sure they're there. Trying to do both at the same time is difficult: it's hard to line up the validations and to make sure all the teams understand when each is needed. And both still take too long to get.

On the topic of out-of-date or sunset modules: it's unclear how many customers may be running these, but Steven has seen support requests come in for out-of-date modules. They use those as an opportunity to get customers to upgrade. Tomas noted that they likely won't be able to "revive" a sunset module, given how quickly the Implementation Guidance and the standards change.
