Thanks to Giorgi Chubinidze, Luke Melley, AJ Munoz, Bobby Best, and Dennis Clark for becoming Knife Steel Nerds Patreon supporters!

Thanks to the guys at Knife Talk podcast for featuring me in the “Community Showcase.” Listen here.

Introduction

I get email updates when new metallurgy journals are released (yes I am really that committed). To my surprise I saw a new knife-related article released in the latest issue of Journal of Materials Engineering and Performance [1]. The article is called, “Effect of Primary Carbides on the Sharpness of Kitchen Knives Made of 8Cr13MoV Steel.” Now that’s an interesting subject! Comparing potential sharpness of different steels, heat treatments, knives, etc. is very difficult because it’s hard to take sharpening skill out of the equation. I have not attempted to do any such tests to compare potential sharpness based on processing or steel so I was curious how these researchers did it.

How They Sharpened

I was definitely interested in how they ensured the sharpening was consistent and eliminated human variables. This is how they described the sharpening process: “The shape of cutting edge is straight, and the minimum thickness of cutting edge (TCE) is 3.5 ± 0.5 µm. The angle of cutting edge (ACE) was 38°, and the relevant parameters are shown in Fig. 1. The test knives were sharpened by sandstone which mainly contained SiO2 and some sticky material, such as clay and carbon. The cooling agent was saline water which contained 15 wt.% salt.”

OK, so they had a fixed edge angle of 19° per side and an average edge tip thickness of 3.5 microns. However, they did not state how the angle was set or what type of sharpening apparatus was used, just that the knives were sharpened on sandstone which contained silica and some “sticky material.” No information on the grit of the sharpening media, deburring, almost nothing. Was it sharpened by hand? A robot? With CATRA sharpening equipment? I have no idea how they sharpened the knives.
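As a quick geometric aside (my own back-of-the-envelope sketch, not from the article): for a symmetric V-edge, the reported 38° included angle and 3.5 µm tip thickness together determine how quickly the edge thickens behind the apex. The function name and the 100 µm example distance are my own choices for illustration.

```python
import math

def edge_thickness_um(distance_um: float,
                      included_angle_deg: float = 38.0,
                      tip_thickness_um: float = 3.5) -> float:
    """Thickness of a symmetric V-edge at a given distance behind the apex.

    The bevel widens by tan(half-angle) on each side, starting from the
    tip thickness reported in the paper (3.5 µm at the apex).
    """
    half_angle = math.radians(included_angle_deg / 2)
    return tip_thickness_um + 2 * distance_um * math.tan(half_angle)

# 100 µm behind the apex the edge is already about 72 µm thick:
print(round(edge_thickness_um(100), 1))  # → 72.4
```

This is why the ±0.5 µm tolerance on tip thickness matters much less than the edge angle for cutting ability a short distance behind the apex.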

How They Tested Sharpness

They tested the knives with a CATRA edge retention tester. I introduced the CATRA in this article and this article. CATRA does not test sharpness; it measures cutting ability. For a fixed edge geometry, cutting ability can be used as a proxy for sharpness. However, the CATRA edge retention tester cannot measure the cutting ability of the “fresh” edge, only the edge after it has seen some wear; I explained why in this article. Each “cut” on the CATRA test is both a forward and a backward stroke, so even cut #1 cannot be used to test sharpness because the initial stroke wears the edge, reducing the sharpness of the second stroke. Therefore, despite the title of the article, these researchers did not test sharpness at all. They tested edge retention. CATRA actually does make a real sharpness tester, but it was not used in this particular study.

The Processing Variables that they Tested

The researchers produced their own 8Cr13MoV steel using vacuum induction melting, electroslag remelting, hot rolling, spheroidize annealing, cold rolling, recrystallization annealing, quenching and tempering. Many of the details of their processing were not provided. However, there were 2 groups tested.

The process as shown above was performed, including hot rolling followed by spheroidize annealing, cold rolling, etc. Before the spheroidize annealing process, the steel was heated to 1180°C (2150°F) for either 30, 60, 90, or 120 minutes.

This processing was performed in an attempt to reduce the amount of primary carbides. Primary carbides are those that form during solidification of the steel and they are typically large. Large carbides reduce toughness and other properties. They found that the “diffusion annealing” process at 1180°C was effective at reducing the amount of primary carbide:

All of the steel was austenitized at 1050°C (1925°F) for 10 minutes, air cooled, and tempered at 180°C (350°F). However, they also tested a process they called “roll forging” where they austenitized the steel at 1050°C for 10 minutes, then air cooled the steel to 750°C (1375°F) and rolled the steel down from 2.5 mm to 1 mm. The steel was then given the final austenitize, quench and temper. That process is shown below:

They show some microstructures in the article at different stages, including (a) after electroslag remelting, (b) the hot-rolled plate, (c) after spheroidize annealing, and (d) the cutting edge of a knife. I don’t know what they mean by cutting edge. Does that mean after the final heat treatment? Is this literally at the very edge on the surface? Also, these are clearly etched samples to show the carbides, but no etchant or etching procedure is described in the article.

Results of “Roll Forging” Comparison

The researchers performed 4 tests of each condition to get a good average. They found that the CATRA edge retention was increased from 218 mm to 279.4 mm by employing the “roll forging” process, an increase of 28%. They attributed this improvement to differences in the primary carbide structure. Their reasoning for this was something I have not seen before, and it requires some explanation and discussion. See the example curves below for (a) the traditional process and (b) the RF process:

In CATRA testing there are often variations from “cut” to “cut,” where one cut will be lower than another as the edge wears throughout the test. These variations are difficult or impossible to avoid. However, they claim that the roll forging process led to less variation between cuts and that this is what provided the benefit. They then offered a hypothesis: on one cut, carbides are pulled out, leaving “pits” in the edge, which increases the friction of the edge and reduces sharpness. In the next cut the pits would be worn away, so the cut after that would have lower friction, and then the cycle would repeat. They provided these micrographs as evidence, with (a) being after one cut and (b) being after the second:

They stated that these images were taken from “the cutting edge” but gave no further information about what position on the edge this means. I don’t really see a clear difference between the two images. There are a couple of larger pits in the first image, but that could just be because of the location that was chosen. There are pits in both images, so this does not provide evidence that the pits are being ground away in alternating cuts. The researchers also seem to be unaware that in the CATRA test each “cut” is actually both a forward and a backward stroke, which throws a wrench in their theory about alternating cuts wearing away the pits. In my CATRA articles I wrote about how the total edge retention can actually be predicted by the result of the very first “cut” (two strokes), because even cut number 1 imposes a significant amount of wear on the edge. That first-cut difference was seen in their two conditions as well. The roll forging process also reduced the thickness of the steel by a large amount, from 2.5 to 1 mm. This effectively reduced the “thickness behind the edge,” which was found to affect the wear behavior in CATRA testing in a previous study:

The researchers made no comment on the thickness effect. The difference could have been eliminated if all of the material had been ground down to the same thickness. I have never seen these “pitted” edges which they claim are from the carbides being removed during the edge retention test. No evidence of significant carbide removal was observed either in the 154CM study I wrote about or in Verhoeven’s CATRA testing [2]. The edge below is after a round of CATRA testing, where everything looks pretty smooth, apart from perhaps a few protruding carbides:

Furthermore, if carbides fell out so easily during abrasion it would make metallography of steel very difficult. No special procedure is used to keep the carbides from falling out; you just grind the steel on sandpaper in progressively finer grits until it is polished. No significant “pits” from large carbides are to be seen, even before etching.

The researchers also reported that the roll forging process effectively reduced the amount of carbide present in the steel. We would expect a reduction in carbides to lead to changes in hardness, because as more carbide is dissolved there is more carbon in solution for higher hardness. However, not a single hardness value is reported anywhere in the journal article. They even stated in their introduction that hardness is an important variable, and yet they did not measure it.

Results of “Diffusion Annealing” Comparison

The researchers found that the 1180°C high temperature process was effective at improving the edge retention from 218 mm to 378.8 mm, an improvement of 74%. They proposed a mechanism similar to the one above, stating that the vacillating edge loss curves in CATRA were less extreme with diffusion annealing due to a reduction in primary carbides. They also stated that there was “more martensite in the knives treated by diffusion annealing,” which would certainly have affected the hardness. But again, no hardness values were reported.
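For reference, the percentage improvements quoted for both processes can be checked directly from the reported total-card-cut values. The numbers come from the article; the helper function is my own trivial sketch:

```python
def percent_increase(baseline: float, improved: float) -> float:
    """Percent increase of `improved` over `baseline`."""
    return 100 * (improved - baseline) / baseline

# CATRA total card cut (mm) reported in the article:
baseline = 218.0            # conventional process
roll_forged = 279.4         # "roll forging" condition
diffusion_annealed = 378.8  # 1180°C "diffusion annealing" condition

print(round(percent_increase(baseline, roll_forged), 1))        # → 28.2
print(round(percent_increase(baseline, diffusion_annealed), 1)) # → 73.8
```

So the reported 28% and 74% figures are internally consistent, whatever the underlying mechanism turns out to be.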

Effect of Primary Carbides in Previous Studies

In the previously linked study comparing CPM-154 and 154CM, we have a good comparison between different carbide sizes. The CPM process greatly reduces the carbide size relative to the conventional 154CM.

CPM-154 with fine carbides

154CM with large primary carbides

Comparing the wear curves of CPM-154 and 154CM I see no noticeable difference in the size of the fluctuations of the test between the two:

As another comparison, we did a CATRA test comparing 154CM and AEB-L. We also had a Damascus condition which was 50:50 AEB-L and 154CM. 154CM, as shown above, has large primary carbides, a good example of the condition described by the researchers in the 8Cr13MoV study above. AEB-L, however, is completely free of primary carbides and has a very fine carbide structure:

However, when comparing the wear behavior of AEB-L and 154CM, both show large fluctuations, particularly at the beginning of the test, which would seem to disprove the hypothesis proposed by the 8Cr13MoV researchers. The AEB-L knife wears so much more quickly, though, that it is difficult to do a full comparison. For each of the knives it appears that the fluctuations become smaller as the edge reaches about 5 mm per cut, so the AEB-L knife had fewer cuts in the more variable region.

More fundamentally, more carbide, including primary carbides, almost always leads to superior behavior on the CATRA test, as I have written about in the past. I have seen no examples where adding in more carbide made the behavior worse. Therefore I find their explanations for differences in behavior very unconvincing. There are types of cutting where less carbide is better, but the CATRA test is not one of them.

Summary

This article starts out kind of bad, right from the title, since they called it a “sharpness” study when they did not in fact test sharpness. Furthermore, their sharpening procedure is so obscure that I could not replicate it. They performed a range of processing types without measuring the final hardness of any of the knives. There was a difference in thickness between knives that they did not account for. And their proposed mechanism (carbides falling out) is odd and has not been seen in other studies. More carbide helps in CATRA edge retention testing. I commend them for trying, but I don’t think these researchers knew enough about what they were doing to perform an effective study. Adding hardness tests would be a good start for improvement. I emailed them and asked for their hardness results and will report back if they give them to me.

[1] Zhu, Qin-tian, Jing Li, Jie Zhang, and Cheng-bin Shi. “Effect of Primary Carbides on the Sharpness of Kitchen Knives Made of 8Cr13MoV Steel.” Journal of Materials Engineering and Performance 28, no. 8 (2019): 4511-4521.

[2] Verhoeven, John D., Alfred H. Pendray, and Howard F. Clark. “Wear tests of steel knife blades.” Wear 265, no. 7-8 (2008): 1093-1099.
