(Cross-posted at Medium)
Even if you’re not a patent lawyer, you’ve probably noticed that patents have been in the news more often lately. The growing problem of patent trolls, companies that make their money by suing other companies for patent infringement, has been the primary reason. Patent trolls now account for nearly two-thirds of all patent infringement lawsuits, draining billions of dollars away from productive companies.
One big reason for patent trolls’ success has been the USPTO issuing a large number of vague patents. These can be used to claim ownership of things like the use of a shopping cart on a website, scanning documents and emailing them from the scanner, and even the use of WiFi at a coffee shop. Most of these patents should never have been issued. Although it is possible to invalidate a patent after the fact, that process costs hundreds of thousands, or even millions, of dollars — enough to bankrupt small companies.
Why Are There So Many Poor Quality Patents?
The Government Accountability Office recently released two reports, one covering patent quality and examiner incentives and the other covering the USPTO’s prior art search capabilities, that show why the USPTO issues poor quality patents in the first place. The big problem is that the USPTO has focused on quantity, not quality, for many years. Under Director Michelle Lee, there has been a new drive to improve patent quality, but it’s been hampered by procedures and systems put in place long before she arrived.
Patent examiners, those people who review applications and decide which patents to issue, are generally measured based on how much time they spend reviewing each application. The faster, the better. Although there is a “patent quality” component to their performance ratings, examiners’ pay and promotions are based almost entirely on their speed and efficiency.
The GAO found there is no consistent definition of patent quality at the USPTO. Without such a definition, it’s nearly impossible to develop standard practices to produce better patents. As a result, examiners have no clear performance measures with respect to quality — just very specific production goals to meet. No wonder they focus on production goals.
It turns out part of the reason for the lack of a “patent quality” performance measure is the USPTO’s outdated information technology infrastructure. I’ve met with USPTO officials, and they expressed frustration that they simply could not gather the data they need. It’s just not in the system and there’s no easy way to capture it. To the USPTO’s credit, it is developing a new system, and enlisted the Department of Commerce for expertise as well. Still, that new system is a way off.
Furthermore, it’s not easy to do a complete job when you’re under a lot of time pressure. The GAO found that 70% of patent examiners say they don’t have enough time allocated to do their jobs thoroughly. In fact, nearly 70% of examiners worked voluntary (i.e., unpaid) overtime just to keep up. The GAO found that a number of factors contribute to this time pressure and make it more difficult to improve patent quality.
First, patent clarity is a big problem. The GAO estimates 90% of examiners always or often encounter broadly worded patent applications, and two-thirds of examiners say this makes it difficult to complete a thorough examination. But even though patent clarity is a real problem, patent examiners, particularly in software, are discouraged from forcing applicants to fix unclear claims. To see what I mean about clarity, here’s an example from the GAO quality report (p. 9):
[O]ne claim for a cardboard coffee cup insulator begins by referring to “a recyclable, insulating beverage container holder, comprising a corrugated tubular member comprising cellulosic material and at least a first opening therein for receiving and retaining a beverage container.”
That kind of language is not exactly easy for laypeople — or even patent lawyers — to read. Patent examiners have the ability to reject claims as unclear or poorly specified using what are called Section 112 rejections. But the GAO found that there is actually substantial pressure to avoid using Section 112 rejections. About 42% of examiners surveyed said that they felt at least a little pressure to avoid 112 rejections, and over a third of those examiners felt moderate to a lot of pressure. But it gets worse in the software patent technology center.
This matters because software patents are the weapon of choice for patent trolls; they tend to be broader than they should be, overly vague, or both. If a patent is unclear, it is much more expensive to defend against, because a defendant has to spend a lot of time (and money) arguing about just what the patent means. That expense creates additional pressure to settle.
Unfortunately, software patent examiners feel more pressure than other tech patent examiners to avoid 112 rejections. The GAO survey found that over half (52.6%) of software examiners felt at least a little pressure to avoid 112 rejections, and over a third (37.6%) felt moderate to a lot of pressure. In other words, the group of examiners that should be focusing the most on clear claims feels the most pressure to avoid doing so.
Another issue for examiners is that there’s no limit to the size of a patent application. A patent claim is the text that describes what the inventor can block others from doing. It’s critical that each patent claim be clear, new, and not just an obvious variant of something else. But many patent applications have dozens of claims to examine, and a patent examiner gets the same amount of time to examine an application with 100 claims as an application with just one claim. This is true even though the applicant pays extra fees to include more than 20 claims in the application.
To make things worse, the USPTO cannot get rid of patent applications except by issuing them as patents. An applicant can choose to abandon an application, but it doesn’t have to. If it wants, an applicant can keep an application alive indefinitely just by paying more fees and filing requests for continued examination. Yet examiners get a fixed amount of time per application, no matter how many requests for continued examination they have to process.
Eventually, the sheer volume of applications on an examiner’s docket will create pressure to allow applications just to get some breathing room. Indeed, the GAO found that around 45% of examiners felt at least some degree of pressure to allow more patents. The pressure is worse in software patents, where 50.2% of examiners felt pressured to allow more patents.
The GAO identified other problems as well, dedicating one of its reports to the USPTO’s lack of good tools for searching for prior art. Ideally, if something has been done before, it was described somewhere in another patent or publication. Patent offices in Japan and Europe have relatively modern tools that let their examiners search a wide variety of databases with a single integrated query, and those tools are intelligent enough to identify relevant documents that don’t use exactly the same words. US patent examiners have nothing like that. To do a thorough search, they have to query multiple databases separately, most of which lack those intelligent features, and they have no easy way to get translations of foreign-language documents. The USPTO is developing an integrated search tool, but that will take time.
The GAO reports shine light on some areas where the USPTO can make a big difference in patent quality, and the GAO’s recommendations should be adopted in full. The current administration is showing a strong desire to improve quality. It will take time and a lot of work, but it’s crucial to innovators and the U.S. economy that this effort succeeds.