
ECC, cryptostorm, Schneier, NSA, & crypto ops 101

Posted: Sat Aug 31, 2013 6:29 pm
by Radicali
{this thread has been split off from a more general discussion of the cryptostorm production rollout schedule, so it can be more accessible to folks with a particular interest in discussing crypto ops & cryptographic engineering in real-world use-case scenarios ~pj}

I read an article not long ago about some of the latest encryption schemes. The elliptic curve had constants that couldn't be accounted for. The article suggested it looked like a back door. I'm rather a novice, so I may not have understood the whole article, but it seemed pretty clear about that part.

Re: Production stats: updates on "go live" of cryptostorm

Posted: Tue Sep 03, 2013 9:34 am
by Guest
Radicali wrote:I read an article not long ago about some of the latest encryption schemes. The elliptic curve had constants that couldn't be accounted for. The article suggested it looked like a back door. I'm rather a novice, so I may not have understood the whole article, but it seemed pretty clear about that part.

I believe you're referring to the elliptic curve asymmetric backdoor in OpenSSL RSA key generation?

An Elliptic Curve Asymmetric Backdoor in OpenSSL RSA Key Generation
Adam L. Young, Moti M. Yung

In this chapter we present an experimental implementation of an asymmetric backdoor in RSA key generation. The implementation is written in ANSI C. We codified what it means for an asymmetric backdoor to be secure (for the designer) in our definition of a secretly embedded trapdoor with universal protection (SETUP). The main properties of a SETUP are: (1) the complete code for the backdoor does not enable anyone except the designer to use the backdoor, and (2) the key pairs that are output by the backdoor RSA key generator appear to all probabilistic polynomial time algorithms like normal (no backdoor) RSA key pairs. We introduced the notion of a SETUP at Crypto '96 (15) and there have been significant advances in the area since then. This chapter and the corresponding appendix constitute fundamental research in cryptovirology and expand on our elliptic curve backdoor in RSA key generation that we presented at the Selected Areas in Cryptography conference in 2005. In particular, the design employs several algorithmic improvements that enable the key generator to run faster. This chapter provides a walk-through of the experimental implementation. The backdoor is based on OpenSSL and the code for it appears in the appendix that is associated with this chapter. For over 10 years we have advocated that the industry change the way RSA keys are generated. We devised and presented heuristic methods that completely foil this entire class of backdoors in RSA key generation (15, 12). The approach in (12) is reminiscent of the NIST FIPS 186-2 DSA parameter generation method.

Please obtain the latest version directly from the official Cryptovirology Labs website. Published in 2006.


{edited to add .pdf & abstract ~pj}

The Strange Story of Dual_EC_DRBG (Schneier)

Posted: Thu Sep 05, 2013 6:16 pm
by Pattern_Juggled
The Strange Story of Dual_EC_DRBG
November 15, 2007 | Bruce Schneier

Random numbers are critical for cryptography: for encryption keys, random authentication challenges, initialization vectors, nonces, key-agreement schemes, generating prime numbers and so on. Break the random-number generator, and most of the time you break the entire security system. Which is why you should worry about a new random-number standard that includes an algorithm that is slow, badly designed and just might contain a backdoor for the National Security Agency.

Generating random numbers isn't easy, and researchers have discovered lots of problems and attacks over the years. A recent paper found a flaw in the Windows 2000 random-number generator. Another paper found flaws in the Linux random-number generator. Back in 1996, an early version of SSL was broken because of flaws in its random-number generator. With John Kelsey and Niels Ferguson in 1999, I co-authored Yarrow, a random-number generator based on our own cryptanalysis work. I improved this design four years later -- and renamed it Fortuna -- in the book Practical Cryptography, which I co-authored with Ferguson.

The U.S. government released a new official standard for random-number generators this year, and it will likely be followed by software and hardware developers around the world. Called NIST Special Publication 800-90 (.pdf), the 130-page document contains four different approved techniques, called DRBGs, or "Deterministic Random Bit Generators." All four are based on existing cryptographic primitives. One is based on hash functions, one on HMAC, one on block ciphers and one on elliptic curves. It's smart cryptographic design to use only a few well-trusted cryptographic primitives, so building a random-number generator out of existing parts is a good thing.
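The hash-based family is the easiest of the four to illustrate. The sketch below is a deliberately simplified, hypothetical hash-DRBG in Python: it follows the general shape these designs share (hash a secret internal state to produce output bits, then advance the state one-way), but it is not the actual SP 800-90 Hash_DRBG algorithm and is for illustration only.

```python
import hashlib

class ToyHashDRBG:
    """Simplified sketch of a hash-based DRBG.

    Mirrors the general shape of SP 800-90's hash-based design --
    hash a secret state for output, then advance the state -- but
    this is NOT the standardized Hash_DRBG. Illustration only.
    """

    def __init__(self, seed: bytes):
        # Derive an internal state V and a per-instance constant C
        # from the seed material.
        self.v = hashlib.sha256(b"V" + seed).digest()
        self.c = hashlib.sha256(b"C" + seed).digest()

    def generate(self, nbytes: int) -> bytes:
        out = b""
        counter = 0
        while len(out) < nbytes:
            out += hashlib.sha256(self.v + counter.to_bytes(4, "big")).digest()
            counter += 1
        # Advance the state through a one-way function, so a later
        # state compromise does not reveal already-produced output.
        self.v = hashlib.sha256(self.v + self.c).digest()
        return out[:nbytes]
```

Note that the same seed always yields the same stream, which is exactly why the standard calls these "Deterministic" Random Bit Generators: all the unpredictability has to come from the entropy used to seed them.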

But one of those generators -- the one based on elliptic curves -- is not like the others. Called Dual_EC_DRBG, not only is it a mouthful to say, it's also three orders of magnitude slower than its peers. It's in the standard only because it's been championed by the NSA, which first proposed it years ago in a related standardization project at the American National Standards Institute.

The NSA has always been intimately involved in U.S. cryptography standards -- it is, after all, expert in making and breaking secret codes. So the agency's participation in the NIST (the U.S. Commerce Department's National Institute of Standards and Technology) standard is not sinister in itself. It's only when you look under the hood at the NSA's contribution that questions arise.

Problems with Dual_EC_DRBG were first described in early 2006. The math is complicated, but the general point is that the random numbers it produces have a small bias. The problem isn't large enough to make the algorithm unusable -- and Appendix E of the NIST standard describes an optional work-around to avoid the issue -- but it's cause for concern. Cryptographers are a conservative bunch: We don't like to use algorithms that have even a whiff of a problem.

But today there's an even bigger stink brewing around Dual_EC_DRBG. In an informal presentation (.pdf) at the CRYPTO 2007 conference in August, Dan Shumow and Niels Ferguson showed that the algorithm contains a weakness that can only be described as a backdoor.

This is how it works: There are a bunch of constants -- fixed numbers -- in the standard used to define the algorithm's elliptic curve. These constants are listed in Appendix A of the NIST publication, but nowhere is it explained where they came from.

What Shumow and Ferguson showed is that these numbers have a relationship with a second, secret set of numbers that can act as a kind of skeleton key. If you know the secret numbers, you can predict the output of the random-number generator after collecting just 32 bytes of its output. To put that in real terms, you only need to monitor one TLS internet encryption connection in order to crack the security of that protocol. If you know the secret numbers, you can completely break any instantiation of Dual_EC_DRBG.

The researchers don't know what the secret numbers are. But because of the way the algorithm works, the person who produced the constants might know; he had the mathematical opportunity to produce the constants and the secret numbers in tandem.
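The trapdoor structure Shumow and Ferguson described can be demonstrated on a toy curve. Everything in the sketch below is made up for illustration: a small prime field rather than NIST P-256, no output truncation, and arbitrary constants. The "designer" publishes constants P = d·Q while keeping d secret; knowing d then turns a single observed output into a prediction of the next one.

```python
# Toy demonstration of a Dual_EC_DRBG-style trapdoor. All parameters
# are invented for illustration; the real generator uses the NIST
# P-256 curve and truncates its output, which this sketch does not.

p, a, b = 10007, 2, 3           # small curve y^2 = x^3 + a*x + b over F_p

def inv(x):
    return pow(x, p - 2, p)     # modular inverse (p is prime)

def ec_add(P1, P2):
    """Add two affine points; None is the point at infinity."""
    if P1 is None: return P2
    if P2 is None: return P1
    (x1, y1), (x2, y2) = P1, P2
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if P1 == P2:
        lam = (3 * x1 * x1 + a) * inv(2 * y1) % p
    else:
        lam = (y2 - y1) * inv(x2 - x1) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def ec_mul(k, P):
    """Double-and-add scalar multiplication."""
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R

def lift_x(x):
    """Recover a curve point with x-coordinate x (valid since p = 3 mod 4)."""
    y2 = (x * x * x + a * x + b) % p
    y = pow(y2, (p + 1) // 4, p)
    return (x, y) if y * y % p == y2 else None

# Designer picks a point Q, a secret d, and publishes P = d*Q as a "constant".
Q = next(pt for pt in (lift_x(x) for x in range(1, p)) if pt)
d = 1337                         # the skeleton key only the designer knows
P = ec_mul(d, Q)

def drbg_round(s):
    """One simplified Dual_EC round: update state via P, emit output via Q."""
    s_new = ec_mul(s, P)[0]
    return s_new, ec_mul(s_new, Q)[0]

s, out1 = drbg_round(9001)       # round 1: the attacker observes out1
s, out2 = drbg_round(s)          # round 2: can the attacker predict out2?

# Attack: lift out1 back to a point R = +/- s*Q; then d*R = +/- s*P,
# whose x-coordinate is the next internal state (negation shares x).
recovered_state = ec_mul(d, lift_x(out1))[0]
predicted_out2 = ec_mul(recovered_state, Q)[0]
assert predicted_out2 == out2
```

The sign ambiguity in lifting the x-coordinate doesn't matter: a point and its negation share an x-coordinate, so the recovered state is correct either way. That is the sense in which the published constants can act as a skeleton key.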

Of course, we have no way of knowing whether the NSA knows the secret numbers that break Dual_EC_DRBG. We have no way of knowing whether an NSA employee working on his own came up with the constants -- and has the secret numbers. We don't know if someone from NIST, or someone in the ANSI working group, has them. Maybe nobody does.

We don't know where the constants came from in the first place. We only know that whoever came up with them could have the key to this backdoor. And we know there's no way for NIST -- or anyone else -- to prove otherwise.

This is scary stuff indeed.

Even if no one knows the secret numbers, the fact that the backdoor is present makes Dual_EC_DRBG very fragile. If someone were to solve just one instance of the algorithm's elliptic-curve problem, he would effectively have the keys to the kingdom. He could then use it for whatever nefarious purpose he wanted. Or he could publish his result, and render every implementation of the random-number generator completely insecure.

It's possible to implement Dual_EC_DRBG in such a way as to protect it against this backdoor, by generating new constants with another secure random-number generator and then publishing the seed. This method is even in the NIST document, in Appendix A. But the procedure is optional, and my guess is that most implementations of the Dual_EC_DRBG won't bother.

If this story leaves you confused, join the club. I don't understand why the NSA was so insistent about including Dual_EC_DRBG in the standard. It makes no sense as a trap door: It's public, and rather obvious. It makes no sense from an engineering perspective: It's too slow for anyone to willingly use it. And it makes no sense from a backwards-compatibility perspective: Swapping one random-number generator for another is easy.

My recommendation, if you're in need of a random-number generator, is not to use Dual_EC_DRBG under any circumstances. If you have to use something in SP 800-90, use CTR_DRBG or Hash_DRBG.

In the meantime, both NIST and the NSA have some explaining to do.

random musings...

Posted: Thu Sep 05, 2013 6:27 pm
by Pattern_Juggled
Radicali wrote:I read an article not long ago about some of the latest encryption schemes. The elliptic curve had constants that couldn't be accounted for. The article suggested it looked like a back door. I'm rather a novice, so I may not have understood the whole article, but it seemed pretty clear about that part.

This sounds like the old story of the dodgy random number generator championed as a "standard" by the NSA.

Broken randomization algorithms are as old as cryptography, and will most likely never go away. Indeed, since the definition of "random" is itself heavily contested - and perhaps forever immutable - it's unlikely this problem can ever be fully solved.

If you ever want to bring a room-full of systems theorists or computational complexity researchers to blows, just ask the innocent question: what's an example of a "perfectly random" data set? Watch the fists fly. :eh:

The best exposition I've seen recently on the topic, in addition to the link above, can be found in Chaitin's Meta Math - an excellent, compact, up-to-date volume that touches on all manner of fascinating mathematical challenges.

That said, the NSA's backdoored, ECC-based random generator is known-bad and afaik isn't used by any credible crypto practitioners... although I remember a rumour it had been adopted by Microsoft at some point. Which does actually make sense - note inclusion of the word "credible," above. :-P


Re: Production stats: updates on "go live" of cryptostorm

Posted: Fri Sep 06, 2013 12:19 pm
by Guest

Bruce Schneier said just yesterday to avoid elliptic curves, based on his review of the recent leaks. Am I missing something?

Also, I thought elliptic curve crypto was proprietary? (BlackBerry/RIM?)

You mentioned a custom kernel - have you moved to a libre setup, or are you still on CentOS? How do you handle blobs, drivers, hardware and such?


Posted: Fri Sep 06, 2013 1:26 pm
by Pattern_Juggled
Pattern_Juggled wrote:That said, the NSA's backdoored, ECC-based random generator is known-bad and afaik isn't used by any credible crypto practitioners... although I remember a rumour it had been adopted by Microsoft at some point. Which does actually make sense - note inclusion of the word "credible," above. :-P

In an irony of timing, this old ham-fisted effort on the part of the NSA to inject a backdoored random number generator into a new NIST standard (as described in above post) has been confirmed via documents released by Snowden. Said confirmation taking place... today.

Quoting from the above-linked NY Times summary:

Simultaneously, the N.S.A. has been deliberately weakening the international encryption standards adopted by developers. One goal in the agency’s 2013 budget request was to “influence policies, standards and specifications for commercial public key technologies,” the most common encryption method.

Cryptographers have long suspected that the agency planted vulnerabilities in a standard adopted in 2006 by the National Institute of Standards and Technology and later by the International Organization for Standardization, which has 163 countries as members.

Classified N.S.A. memos appear to confirm that the fatal weakness, discovered by two Microsoft cryptographers in 2007, was engineered by the agency. The N.S.A. wrote the standard and aggressively pushed it on the international group, privately calling the effort “a challenge in finesse.”

“Eventually, N.S.A. became the sole editor,” the memo says.

So... there you go. It was a backdoor - just as it appeared to be. It certainly doesn't seem like the NSA did such a great job in overcoming the "challenge in finesse" involved in that process, however, since nobody has trusted that standard ever since; we've all assumed (correctly) that it was backdoored.

I did see an article someplace that Microsoft has put that random generator into practice... but I can't recall where I saw it. When I bump into it again - or if someone else reading this has the cite handy - I'll post it up here.


How to remain secure against NSA surveillance

Posted: Fri Sep 06, 2013 1:44 pm
by Pattern_Juggled
guest wrote:Bruce Schneier said just yesterday to avoid elliptic curve. Based on his review of recent leaks. Am I missing something?

Breaking up replies to this into chunks. But first, I think it's worth echoing over Bruce's entire Guardian piece to aid in discussion here. It's an absolutely fascinating essay:

How to remain secure against NSA surveillance
The NSA has huge capabilities – and if it wants in to your computer, it's in. With that in mind, here are five ways to stay safe
Bruce Schneier | The Guardian | Thursday 5 September 2013 20.06 BST

Now that we have enough details about how the NSA eavesdrops on the internet, including today's disclosures of the NSA's deliberate weakening of cryptographic systems, we can finally start to figure out how to protect ourselves.

For the past two weeks, I have been working with the Guardian on NSA stories, and have read hundreds of top-secret NSA documents provided by whistleblower Edward Snowden. I wasn't part of today's story – it was in process well before I showed up – but everything I read confirms what the Guardian is reporting.

At this point, I feel I can provide some advice for keeping secure against such an adversary.

The primary way the NSA eavesdrops on internet communications is in the network. That's where their capabilities best scale. They have invested in enormous programs to automatically collect and analyze network traffic. Anything that requires them to attack individual endpoint computers is significantly more costly and risky for them, and they will do those things carefully and sparingly.

Leveraging its secret agreements with telecommunications companies – all the US and UK ones, and many other "partners" around the world – the NSA gets access to the communications trunks that move internet traffic. In cases where it doesn't have that sort of friendly access, it does its best to surreptitiously monitor communications channels: tapping undersea cables, intercepting satellite communications, and so on.

That's an enormous amount of data, and the NSA has equivalently enormous capabilities to quickly sift through it all, looking for interesting traffic. "Interesting" can be defined in many ways: by the source, the destination, the content, the individuals involved, and so on. This data is funneled into the vast NSA system for future analysis.

The NSA collects much more metadata about internet traffic: who is talking to whom, when, how much, and by what mode of communication. Metadata is a lot easier to store and analyze than content. It can be extremely personal to the individual, and is enormously valuable intelligence.

The Systems Intelligence Directorate is in charge of data collection, and the resources it devotes to this is staggering. I read status report after status report about these programs, discussing capabilities, operational details, planned upgrades, and so on. Each individual problem – recovering electronic signals from fiber, keeping up with the terabyte streams as they go by, filtering out the interesting stuff – has its own group dedicated to solving it. Its reach is global.

The NSA also attacks network devices directly: routers, switches, firewalls, etc. Most of these devices have surveillance capabilities already built in; the trick is to surreptitiously turn them on. This is an especially fruitful avenue of attack; routers are updated less frequently, tend not to have security software installed on them, and are generally ignored as a vulnerability.

The NSA also devotes considerable resources to attacking endpoint computers. This kind of thing is done by its TAO – Tailored Access Operations – group. TAO has a menu of exploits it can serve up against your computer – whether you're running Windows, Mac OS, Linux, iOS, or something else – and a variety of tricks to get them on to your computer. Your anti-virus software won't detect them, and you'd have trouble finding them even if you knew where to look. These are hacker tools designed by hackers with an essentially unlimited budget. What I took away from reading the Snowden documents was that if the NSA wants in to your computer, it's in. Period.

The NSA deals with any encrypted data it encounters more by subverting the underlying cryptography than by leveraging any secret mathematical breakthroughs. First, there's a lot of bad cryptography out there. If it finds an internet connection protected by MS-CHAP, for example, that's easy to break and recover the key. It exploits poorly chosen user passwords, using the same dictionary attacks hackers use in the unclassified world.

As was revealed today, the NSA also works with security product vendors to ensure that commercial encryption products are broken in secret ways that only it knows about. We know this has happened historically: CryptoAG and Lotus Notes are the most public examples, and there is evidence of a back door in Windows. A few people have told me some recent stories about their experiences, and I plan to write about them soon. Basically, the NSA asks companies to subtly change their products in undetectable ways: making the random number generator less random, leaking the key somehow, adding a common exponent to a public-key exchange protocol, and so on. If the back door is discovered, it's explained away as a mistake. And as we now know, the NSA has enjoyed enormous success from this program.

TAO also hacks into computers to recover long-term keys. So if you're running a VPN that uses a complex shared secret to protect your data and the NSA decides it cares, it might try to steal that secret. This kind of thing is only done against high-value targets.

How do you communicate securely against such an adversary? Snowden said it in an online Q&A soon after he made his first document public: "Encryption works. Properly implemented strong crypto systems are one of the few things that you can rely on."

I believe this is true, despite today's revelations and tantalizing hints of "groundbreaking cryptanalytic capabilities" made by James Clapper, the director of national intelligence in another top-secret document. Those capabilities involve deliberately weakening the cryptography.

Snowden's follow-on sentence is equally important: "Unfortunately, endpoint security is so terrifically weak that NSA can frequently find ways around it."

Endpoint means the software you're using, the computer you're using it on, and the local network you're using it in. If the NSA can modify the encryption algorithm or drop a Trojan on your computer, all the cryptography in the world doesn't matter at all. If you want to remain secure against the NSA, you need to do your best to ensure that the encryption can operate unimpeded.

With all this in mind, I have five pieces of advice:

    1) Hide in the network. Implement hidden services. Use Tor to anonymize yourself. Yes, the NSA targets Tor users, but it's work for them. The less obvious you are, the safer you are.

    2) Encrypt your communications. Use TLS. Use IPsec. Again, while it's true that the NSA targets encrypted connections – and it may have explicit exploits against these protocols – you're much better protected than if you communicate in the clear.

    3) Assume that while your computer can be compromised, it would take work and risk on the part of the NSA – so it probably isn't. If you have something really important, use an air gap. Since I started working with the Snowden documents, I bought a new computer that has never been connected to the internet. If I want to transfer a file, I encrypt the file on the secure computer and walk it over to my internet computer, using a USB stick. To decrypt something, I reverse the process. This might not be bulletproof, but it's pretty good.

    4) Be suspicious of commercial encryption software, especially from large vendors. My guess is that most encryption products from large US companies have NSA-friendly back doors, and many foreign ones probably do as well. It's prudent to assume that foreign products also have foreign-installed backdoors. Closed-source software is easier for the NSA to backdoor than open-source software. Systems relying on master secrets are vulnerable to the NSA, through either legal or more clandestine means.

    5) Try to use public-domain encryption that has to be compatible with other implementations. For example, it's harder for the NSA to backdoor TLS than BitLocker, because any vendor's TLS has to be compatible with every other vendor's TLS, while BitLocker only has to be compatible with itself, giving the NSA a lot more freedom to make changes. And because BitLocker is proprietary, it's far less likely those changes will be discovered. Prefer symmetric cryptography over public-key cryptography. Prefer conventional discrete-log-based systems over elliptic-curve systems; the latter have constants that the NSA influences when they can.
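Point 2 above, in concrete terms: any mainstream TLS library gives you interoperable, well-reviewed encryption with a few lines of code. The hypothetical sketch below uses Python's standard ssl module; the helper name and the plain HTTP/1.0 request are illustrative, not a hardened client.

```python
import socket
import ssl

# ssl.create_default_context() enables certificate verification and
# hostname checking and negotiates the strongest mutually supported
# protocol version -- sensible defaults for an interoperable client.
context = ssl.create_default_context()
assert context.verify_mode == ssl.CERT_REQUIRED
assert context.check_hostname

def fetch_page(host: str, port: int = 443) -> bytes:
    """Fetch / from host over a certificate-verified TLS connection."""
    with socket.create_connection((host, port)) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            tls.sendall(b"GET / HTTP/1.0\r\nHost: " + host.encode() + b"\r\n\r\n")
            chunks = []
            while True:
                data = tls.recv(4096)
                if not data:
                    break
                chunks.append(data)
            return b"".join(chunks)
```

The point is less the specific code than the default posture: verification on, hostnames checked, and a protocol every peer must interoperate with.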

Since I started working with Snowden's documents, I have been using GPG, Silent Circle, Tails, OTR, TrueCrypt, BleachBit, and a few other things I'm not going to write about. There's an undocumented encryption feature in my Password Safe program from the command line; I've been using that as well.

I understand that most of this is impossible for the typical internet user. Even I don't use all these tools for most everything I am working on. And I'm still primarily on Windows, unfortunately. Linux would be safer.

The NSA has turned the fabric of the internet into a vast surveillance platform, but they are not magical. They're limited by the same economic realities as the rest of us, and our best defense is to make surveillance of us as expensive as possible.

Trust the math. Encryption is your friend. Use it well, and do your best to ensure that nothing can compromise it. That's how you can remain secure even in the face of the NSA.

Elliptic Curve Cryptography (Linux Journal)

Posted: Fri Sep 06, 2013 2:09 pm
by Pattern_Juggled
{importing in a nice backgrounder on ECC, although this NSA puff-piece is also pretty funny/sad to read given their full-scale subversion of NIST ~pj}

Elliptic Curve Cryptography
Apr 08, 2020 | Joe Hendrix | Linux Journal

When it comes to public key cryptography, most systems today are still stuck in the 1970s. On December 14, 1977, two events occurred that would change the world: Paramount Pictures released Saturday Night Fever, and MIT filed the patent for RSA. Just as Saturday Night Fever helped popularize disco through its choreography and soundtrack, RSA helped popularize cryptography by allowing two parties to communicate securely without a shared secret.

Public key techniques, such as RSA, have revolutionized cryptography and form the basis for Web site encryption via SSL/TLS, server administration via SSH, secure e-mail and IP encryption (IPsec). They do this by splitting the shared secret key used in traditional cryptography into two parts: a public key for identifying oneself and a secret key for proving an identity electronically. Although the popularity of disco has waned, most Web sites today that use encryption still are using RSA.

Since the 1970s, newer techniques have been developed that offer better security with smaller key sizes than RSA. One major breakthrough is the development of cryptography based on the mathematical theory of elliptic curves, called ECC (Elliptic Curve Cryptography). Although ECC has a reputation for being quite complex, it has been integrated into popular open-source cryptographic software including OpenSSH and OpenSSL, and it's not inherently any more difficult to use than RSA. In this article, I describe ECC and show how it can be used with recent versions of OpenSSH and OpenSSL.

Not all cryptographic algorithms are equal. For a fixed key or output length, one algorithm may provide much more security than another. This is particularly true when comparing different types of algorithms, such as comparing public and symmetric key algorithms. To help make sense of this, the National Institute of Standards and Technology (NIST) reviews the academic literature on attacking cryptographic algorithms and makes recommendations on the actual security provided by different algorithms (see Table 1 from 2011).

Table 1. NIST Recommended Key Sizes

Bits of Security | Symmetric Key Algorithm | Corresponding Hash Function | Corresponding RSA Key Size | Corresponding ECC Key Size
---------------- | ----------------------- | --------------------------- | -------------------------- | --------------------------
80               | Triple DES (2 keys)     | SHA-1                       | 1024                       | 160
112              | Triple DES (3 keys)     | SHA-224                     | 2048                       | 224
128              | AES-128                 | SHA-256                     | 3072                       | 256
192              | AES-192                 | SHA-384                     | 7680                       | 384
256              | AES-256                 | SHA-512                     | 15360                      | 512

Note: for new applications, I think AES-128 should be used over triple DES even if 128-bit security isn't needed. Attacks have been found on SHA-1, and NIST now estimates that SHA-1 provides only 69 bits of security in digital signature applications.

If the system you are designing is expected to protect information only until 2030, NIST recommends that you use cryptography providing at least 112 bits of security. For applications that need longer-term protection, NIST recommends at least 128 bits of security.

Department of Defense Requirements

Although NIST guidance is well respected, the Department of Defense has stronger requirements for classified information. For the Defense Department, 128 bits is only good enough for protecting information classified SECRET. Use of RSA isn't approved, and TOP SECRET information requires use of AES-256, SHA-384 and ECC with a 384-bit key size. Furthermore, systems must use two separate encryption implementations for protection. For example, use both IPsec and TLS, so that the information is still protected by one layer if a flaw in the other is found. Although this may not be very practical for most Internet applications, it's interesting to see what the requirements are when security is paramount.

Just because NIST makes these recommendations, doesn't mean that applications follow them. Many Web sites, including on-line banks, still will use SHA-1 and pair it with AES 128 and a 1024- or 2048-bit RSA key. According to NIST, achieving true 128-bit security means that the RSA key should be at least 3072 bits—a size most Internet certificate authorities don't even offer. At present, Verisign will sell you an SSL certificate that it claims offers "256-bit security", because you can use it with AES-256. The signature itself uses SHA-1 and a 2048-bit RSA key.

At present, the security on the Internet is still sufficiently weak that it almost always will be easier to find a vulnerability that allows an attacker to bypass security rather than directly attack the encryption. However, it is still worthwhile to be aware of how much security the overall encryption implementation provides. In cryptography, more bits are usually better, but an implementation is only as strong as its weakest link. Both ECC and SHA-2 represent essential algorithms to getting real 128-bit or 256-bit security.

The Mathematics of Elliptic Curve Cryptography

Elliptic Curve Cryptography has a reputation for being complex and highly technical. This isn't surprising when the Wikipedia article introduces an elliptic curve as "a smooth, projective algebraic curve of genus one". Elliptic curves also show up in the proof of Fermat's last theorem and the Birch and Swinnerton-Dyer conjecture. You can win a million dollars if you solve that problem.

To get a basic understanding of ECC, you need to understand four things:

The definition of an elliptic curve.

The elliptic curve group.

Scalar multiplication over the elliptic curve group.

Finite field arithmetic.

Essentially, elliptic curves are the points on the plane that satisfy an equation of the form:

y^2 = x^3 + ax + b

Figure 1 shows a picture of an elliptic curve over the real numbers where a is –1 and b is 1. Elliptic curves satisfy some interesting mathematical properties. The curve is symmetric around the x axis, so that if (x,y) is a point on the curve, then (x,–y) is also on the curve. If you draw a line between any two points on the curve with different x coordinates, it will intersect the curve at a unique third point. Finally, for each point on the curve, if you draw a straight line tangent to the curve from that point, it will intersect the curve once again at another point.

Mathematicians use these properties to form a structure called a group from the points on the elliptic curve. A group consists of a set of elements containing a special point (denoted 0), an operation for negating an element (denoted –x), and an operation for adding two elements (denoted x + y). The elements in the group defined by an elliptic curve consist of the points on the curve plus an additional point for 0 that is not on the curve, but as you'll see below is easiest to visualize as a line on the x-axis. To negate a point, you just negate the y-coordinate of the point, and adding a point to its negation is defined to return 0 (Figure 2). To add two points P and Q with different x-coordinates, draw a line connecting the two points and extending beyond them. This line should intersect the curve at a third point. The sum R = P + Q is the negation of the third point. Finally, to add a point P to itself, draw the line tangent to P (Figure 3). The sum R = 2P is the negation of the point that line intersects (Figure 4).

Once the group is defined, we can talk about scalar multiplication—the fundamental operation that makes elliptic curves useful in cryptography. The kth scalar multiple of P is the point obtained by adding P to itself k times. This can be done efficiently by representing k as a binary number and using a double-and-add multiplication method. If you are familiar with RSA, scalar multiplication plays a similar role in ECC that modular exponentiation plays in RSA.

The real numbers used in diagrams for explaining ECC are not practical to use in actual implementations. Real numbers can have an arbitrary number of digits, and computers have only a finite amount of memory. Most applications, including OpenSSL, use elliptic curves over coordinates that use modular arithmetic, where the modulus is a large prime number. Figure 5 shows the elliptic curve with the same equation as in Figure 1, but where arithmetic is performed modulo 19.
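The group law and double-and-add scalar multiplication described above fit in a few lines of Python. This sketch uses the same toy curve as the figures, y^2 = x^3 - x + 1, over the 19-element field, so the numbers stay small enough to read (function and variable names are my own, not from any library):

```python
P_MOD, A, B = 19, -1, 1   # the Figure 5 curve: y^2 = x^3 - x + 1 mod 19

def ec_add(P, Q):
    """Group law in affine coordinates; None is the identity (the point 0)."""
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None                              # P + (-P) = 0
    if P == Q:                                   # tangent line (doubling)
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:                                        # chord through the two points
        lam = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (lam * lam - x1 - x2) % P_MOD
    return (x3, (lam * (x1 - x3) - y1) % P_MOD)

def ec_mul(k, P):
    """Scalar multiplication k*P by double-and-add over the bits of k."""
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R

# Enumerate the finite group: every (x, y) pair satisfying the equation.
points = [(x, y) for x in range(P_MOD) for y in range(P_MOD)
          if (y * y - (x ** 3 + A * x + B)) % P_MOD == 0]
```

For example, taking G = points[0], ec_mul(5, G) gives the same point as adding G to itself five times with ec_add, but it only walks the three bits of 5 rather than performing five additions; for a 256-bit scalar the savings are what makes ECC practical.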

For the different key sizes in Table 1, NIST recommends a specific elliptic curve with a prime modulus for that key size (see the Binary Fields sidebar). For each key size, NIST specifies three things:

The coefficients in the elliptic curve equation.

The size of the underlying field for representing x and y.

A base point that is a point on the curve to be used when calling scalar multiplication.

Binary Fields

For each bit size, NIST also recommends two other elliptic curves over a type of field called a binary field. Although prime fields are more common in software, binary fields are common when implementing ECC in low-power hardware. I focus on prime curves in this article, because that's what OpenSSL uses, and there are a lot more patents on binary curve implementations than prime curves. Unless you have some specific hardware needs and also money to spend on lawyers to deal with patents, I'd recommend sticking to prime curves.

To see how big the numbers for a 256-bit curve are, the NIST P-256 curve equation has the coefficients a = –3 and b = 41058363725152142129326129780047268409114441015993725554835256314039467401291.

The coordinates are in a prime field modulo p_256 where:

p_256 = 2^256 – 2^224 + 2^192 + 2^96 – 1

The base point is G = (xG, yG), defined by:

xG = 48439561293906451759052585252797914202762949526041747995844080717082404635286

yG = 36134250956749795798585127919587881956611106672985015071877198253568414405109

If these numbers look big to you, consider that a 256-bit elliptic curve is regarded as equivalent in strength to RSA with 3072-bit keys; RSA public keys contain more than 12 times the number of digits.
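The constants quoted above can be sanity-checked directly: plugging the base point into the curve equation mod p_256 should give equality. A quick Python check (values copied verbatim from the text):

```python
# Sanity-check the NIST P-256 constants: the base point G must
# satisfy the curve equation y^2 = x^3 - 3x + b (mod p_256).

p_256 = 2**256 - 2**224 + 2**192 + 2**96 - 1
a = -3
b = 41058363725152142129326129780047268409114441015993725554835256314039467401291
xG = 48439561293906451759052585252797914202762949526041747995844080717082404635286
yG = 36134250956749795798585127919587881956611106672985015071877198253568414405109

# Left side minus right side of the curve equation must vanish mod p.
assert (yG * yG - (xG ** 3 + a * xG + b)) % p_256 == 0
print("G is on P-256")
```

Python's arbitrary-precision integers make this a one-liner; in C you would need a bignum library such as the one inside OpenSSL.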

If you'd like to learn more about Elliptic Curve Cryptography, there are many references available. Certicom, a company founded by some of the inventors of ECC, hosts an on-line tutorial. For a more comprehensive understanding of cryptography, the book Understanding Cryptography by Christof Paar, Jan Pelzl and Bart Preneel has a chapter about ECC and also covers AES and SHA. I've just touched the basic definitions here, and I've not discussed the optimizations used to make a high-performance implementation like the one in OpenSSL. For a quite comprehensive reference on fast ECC algorithms, the "Handbook of Elliptic and Hyperelliptic Curve Cryptography" has yet to let me down.

Using Elliptic Curve Cryptography in OpenSSH

A little more than a year ago, OpenSSH 5.7 added support for ECC-based cryptography. Although it's still not in every Linux distribution, support for ECC finally is becoming widespread enough that it's starting to be worth considering a migration. Support for ECC requires OpenSSH version 5.7 or later and OpenSSL version 0.9.8g or later. OpenSSH can use ECC both to help you authenticate that you really are talking to the server you want and to help the server perform key-based authentication of users.

Host authentication is used by the client to authenticate the server. It is used to detect man-in-the-middle attacks and normally is set up automatically and used by OpenSSH. When OpenSSH is installed, it should create one or more host keys, which normally are stored in /etc/ssh. The ECC private key normally is named ssh_host_ecdsa_key, and the corresponding public key normally has the same name with a .pub extension. See the man page for sshd_config if you would like to change this path. Just make sure that the private key can be read only by authorized admins; anybody with access to the host private key potentially could impersonate the server.

Client authentication is used to authenticate the client against the server. Using keys to authenticate rather than passwords is both more convenient (because you can use ssh-agent or another program to cache the key) and more secure (because the password is never sent in plain text to the server). If you have used SSH for significant work in the past, you've probably set this up using RSA keys, and the exact same process, namely using ssh-keygen, is used to create ECC keys. The only difference is to pass -t ecdsa to create the key. The man page for ssh-keygen will have more details, and there are many tutorials for setting up SSH keys available on-line if you need a walk-through.

For most people, once encryption software supporting ECC is more widely deployed, converting to ECC should be quick and painless. RSA still is probably "good enough" for most applications, but ECC is significantly more secure, and it may be essential to getting strong security on tiny, low-power, networked devices that are becoming more widespread. Its introduction into open-source tools like OpenSSL and OpenSSH is definitely a great step toward gaining more widespread use.


Joe Hendrix is a security researcher who works in Portland, Oregon, for Galois, Inc. His main interest is in applying formal verification techniques to real security problems.

ECC & cryptographic engineering

Posted: Fri Sep 06, 2020 2:44 pm
by Pattern_Juggled
guest wrote:Bruce Schneier said just yesterday to avoid elliptic curve. Based on his review of recent leaks. Am I missing something?

Also I thought elliptic was proprietary? (blackberry/rim)?

ECC certainly isn't proprietary - any more than DH or RSA, respectively, are. All three are simply forms of mathematical transforms that are difficult/"difficult" to reverse - DH & RSA end up being roughly equivalent in terms of their underlying number-theoretic foundations; ECC is... a little bit different. No, it's not "quantum computer proof" as some folks suggest - but we can say that it's not (directly) vulnerable to Shor's algorithm as instantiated on existing qubit-based machines (both in theory and in practice).

There are proprietary implementations of ECC - just as with DH and RSA - but that's a different issue. At a mathematical level, all three are best considered techniques, not products.

Let me rewind back to my first/only reference to elliptic maths in this thread. Quoting:

The production work, if I may summarize, seems to be moving along nicely. There's been some wrestling with the elliptic curve stuff to implement a flavour of TLS that's not useless... and that's a more involved process than it would first appear. Many kernel recompiles later, we're getting the hang of it - apparently. :eh:

Unpacking this somewhat, here's the deal: TLS is broken.

TLS (which is what we're supposed to call SSL, but nobody really does so I'm just going to use SSL like everyone else) is what makes https httpS. And saying it's broken is dumb, because it's actually a framework with all sorts of available options; the options are selected in negotiation between the web browser and a particular server. If those two negotiate a bad cipher selection, or a bad implementation of same - or both - during a session, then SSL will be easy to break by any attacker with access to the cipher text.

Currently, lots of web servers default to TLS parameters that suck. And most browsers will accept those parameters, painting the reassuring-looking lock-logo in the address bar line... making people feel falsely secure. Now, it's possible actually to make your web server demand that the browser choose a good cipher for the session - no good cipher, and the server turns its nose up and refuses the connection.

For some circumstances, being picky like that - if you're running the server - is well worth it.
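To make that concrete: here's a hypothetical nginx fragment sketching what server-side pickiness looks like. The directive names (ssl_protocols, ssl_ciphers, ssl_prefer_server_ciphers) are real nginx ssl-module directives, but the cipher list and values are illustrative only - not anyone's production config - and would need tuning per deployment:

```nginx
# Hypothetical, illustrative fragment: accept only ECDHE (PFS-capable)
# suites and let the server's preference order win. Clients that can't
# negotiate one of these suites are simply refused.
server {
    listen 443 ssl;
    ssl_protocols TLSv1.2;
    ssl_prefer_server_ciphers on;
    ssl_ciphers "ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:!aNULL:!MD5";
}
```

The cipher string syntax is OpenSSL's: a colon-separated preference list, with ! entries hard-excluding suites from ever being negotiated.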

The ciphers used in this process are part of the OpenSSL library - nobody uses stuff outside the library, afaik, in production systems. As such, you basically tell your web server "preferentially make use of the following ciphers from the toolkit made available to you via OpenSSL, when you get a request for a secure session from a new web browser" - and it picks from the available list.

Which ciphers are available to the kernel - and hence the web server - is determined by which ones are rolled into the build of the OpenSSL libraries being used. Our decision is to call from a wide toolkit, so that our servers are able to make use of them in negotiating SSL connections.

Where this all actually leads is proper implementation of perfect forward secrecy (PFS) for HTTPS sessions. And it turns out that, in order to offer PFS to visitors whose browsers can handle it, you have to make sure your kernel has all the requisite OpenSSL goodies baked into it already - or you have to recompile your kernel to include them. So that's what we've been doing, as offering PFS - indeed, requiring PFS - is a high priority for us.

Who cares about HTTPS anyway, since it's just a damned website (or this forum)? Well, OpenVPN makes use of HTTPS/SSL when it's doing setup of secure sessions... for the VPN connections themselves! So doing SSL right - and avoiding broken SSL configurations - actually matters a lot if you want to run a competent VPN service. Hence the fiddling with the kernel to get the full suite of ciphers into it, and so on, which I referenced above.

What about ECC - why even use it? Simple: ceteris paribus, Schneier encourages us to prefer "conventional" (i.e. RSA/DH... actually he suggests DH over RSA) public key exchange techniques over ECC. Which I agree with; hell, I agree with 99% of what Schneier says and consider him functionally canonical wrt crypto implementation questions. However, if we're forced outside of ceteris paribus, what then?

Clearly, it's better to rely on a competent ECC implementation of keylength X than on DH with keylength Y, for some values of X and Y. DH below certain levels of key entropy is trivially easy to brute-force - as is essentially any mathematical transform. And even if ECC's constants (and/or the randomization techniques used to generate initial keys) are subtly fuzzed by the NSA, pushing them up to a big number makes them better than a tiny key for DH (which actually isn't necessarily true - it depends on how fuzzed things are, which we don't actually know - yet - and which we have to make assumptions about, etc.)

Basically, sometimes it's better to use ECC if a given client doesn't have anything else that's properly configured for use in setting up an SSL session. Which is to say that having ECC in one's server-side toolkit is useful... particularly when it comes to PFS, the implementation of which (in certain circumstances) positively requires access to ECC ciphers. So they're available in our standardised kernel build.

The ironic thing, to me anyhow, is that this is all related to public key cryptography... which is itself essentially focused on two people exchanging data and proving they are who they say they are to each other without first having to exchange symmetric keys via some other "secure" channel. And, when it comes to token-based authentication, we've moved away from a heavy emphasis on PKI-based auth as part of VPN sessions - in favor of, yep, symmetric keys.

You mentioned a custom kernel - have you moved to a libre setup, or are you still on CentOS? How do you handle blobs, drivers, hardware and such?

I certainly wouldn't recommend genuinely "custom" kernels for much in the way of production systems. When I use the generic term "custom," I'm referencing tweaks to off-the-shelf kernels, not actual full-blown custom OS guts. <shivers>

That said, yes we do get fairly aggressive in tweaking our production Linux kernels for exit nodes - that's always been true to one degree or another, as you know. Some things that aren't needed are stripped out; other capabilities that often aren't part of standard builds we add in manually, pre-compile. Having years worth of experience in optimizing exit nodes, our admin team has some precise preferences when it comes to kernel inclusions.


Re: ECC, cryptostorm, Schneier, NSA, & crypto ops 101

Posted: Tue Sep 10, 2020 2:31 pm
by Pattern_Juggled
Here's the original NIST 1999 paper on specifying curve parameters. It's an interesting read - both technically, and in terms of the way it's phrased and the tricksy underpinnings we now know were there all along...

July 1999


Re: ECC, cryptostorm, Schneier, NSA, & crypto ops 101

Posted: Wed Sep 11, 2020 5:42 am
by Guest
Tricksy underpinnings and phrasing would probably go over my head, but I'd be curious to read a knowledgeable person's thoughts on such. If you have time, and it could be explained without too much complex mathematics. I have trouble understanding this stuff.

Re: ECC, cryptostorm, Schneier, NSA, & crypto ops 101

Posted: Fri Sep 27, 2020 4:50 pm
by guest
{apologies for delayed post approval; network rollout deathmarch ~admin}

Recently, while reading an article on some of the recent crypto revelations, I came across a link to a NIST list of companies using the bunk ECC algorithm. Since I've heard plenty about them leveraging companies and weakening the crypto, but very little about whose products may be affected, I thought this may be of some interest to those who haven't seen it. Also, I just want to say thanks for fighting for our right to privacy.

the article:

and the list around halfway through the piece

Posted: Fri Oct 18, 2020 12:25 am
by Pattern_Juggled
Here's a badass resource for choosing and implementing proper ECC curves:

SafeCurves: choosing safe curves for elliptic-curve cryptography

There are several different standards covering selection of curves for use in elliptic-curve cryptography (ECC):

    ANSI X9.62 (1999).
    IEEE P1363 (2000).
    SEC 2 (2000).
    NIST FIPS 186-2 (2000).
    ANSI X9.63 (2001).
    Brainpool (2005).
    NSA Suite B (2005).
    ANSSI FRP256V1 (2011).

Each of these standards tries to ensure that the elliptic-curve discrete-logarithm problem (ECDLP) is difficult. ECDLP is the problem of finding an ECC user's secret key, given the user's public key.

Unfortunately, there is a gap between ECDLP difficulty and ECC security. None of these standards do a good job of ensuring ECC security. There are many attacks that break real-world ECC without solving ECDLP. The core problem is that if you implement the standard curves, chances are you're doing it wrong:

    Your implementation produces incorrect results for some rare curve points.
    Your implementation leaks secret data when the input isn't a curve point.
    Your implementation leaks secret data through branch timing.
    Your implementation leaks secret data through cache timing.

These problems are exploitable by real attackers, taking advantage of the gaps between ECDLP and real-world ECC:

    ECDLP is non-interactive. Real-world ECC handles attacker-controlled input.
    ECDLP reveals only nP. Real-world ECC also reveals timing (and, in some situations, much more side-channel information).
    ECDLP always computes nP correctly. Real-world ECC has failure cases.

Secure implementations of the standard curves are theoretically possible but very hard.

Most of these attacks would have been ruled out by better choices of curves that allow simple implementations to be secure implementations. This is the primary motivation for SafeCurves. The SafeCurves criteria are designed to ensure ECC security, not just ECDLP security.

Other attacks would have been ruled out by better choices at higher levels of ECC protocols. For example, deterministic nonces were proposed in 1997, are integrated into modern signature mechanisms such as EdDSA, and would have prevented the 2010 Sony PlayStation ECDSA security disaster. However, this security issue does not interact with curve choices, so it is outside the scope of SafeCurves.

Error-prone cryptographic designs

Posted: Sat Jan 10, 2020 3:22 pm
by Pattern_Juggled
Error-prone cryptographic designs

Daniel J. Bernstein
University of Illinois at Chicago
Technische Universiteit Eindhoven

“The poor user is given enough rope with which
to hang himself—something a standard should not do.”
—1992 Rivest

...commenting on nonce generation inside Digital Signature Algorithm (1991 proposal by NIST, 1992 credited to NSA, 1994 standardized by NIST)


Re: ECC, cryptostorm, Schneier, NSA, & crypto ops 101

Posted: Mon Jan 12, 2020 3:36 am
by marzametal
Interesting PDFs - Cryptome

The new reloaded Silk Road has switched from Tor to this - I2P

Re: ECC, cryptostorm, Schneier, NSA, & crypto ops 101

Posted: Mon Jan 12, 2020 3:37 pm
by DesuStrike
marzametal wrote:The new reloaded Silk Road has switched from Tor to this - I2P

I'm always wary of security and privacy tools built on Java and such. I know this is a bold statement from someone who uses an Android phone, but the Java VM has a remarkable track record of security nightmares. :think:

Re: ECC, cryptostorm, Schneier, NSA, & crypto ops 101

Posted: Tue Jan 13, 2020 2:21 am
by marzametal
Bloody hell, Java in general is blargh!!!