Both books outline how consumer scoring works. Data brokers amass dossiers with thousands of details about individual consumers, like age, religion, ethnicity, profession, mortgage size, social networks, estimated income and health concerns such as impotence and irritable bowel syndrome. Then analytics engines can compare patterns in those variables against computer forecasting models. Algorithms are used to assign consumers scores — and to recommend offering, or withholding, particular products, services or fees — based on predictions about their behavior.
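The pipeline described here, dossier attributes fed through a forecasting model to yield a score and an offer-or-withhold decision, can be sketched in a few lines. Everything below is invented for illustration: the feature names, weights, bias and threshold are hypothetical stand-ins, not any broker's actual model.

```python
# Minimal sketch of a consumer-scoring pipeline, with made-up weights:
# dossier attributes -> weighted logistic score -> offer/withhold decision.
import math

# Hypothetical weights a forecasting model might have learned from
# historical behavior data (all values invented for this example).
MODEL_WEIGHTS = {
    "estimated_income": 0.00003,   # per dollar of estimated income
    "mortgage_size": -0.000005,    # per dollar of mortgage
    "age": 0.01,                   # per year of age
    "health_flag": -0.8,           # a flagged health concern in the dossier
}
BIAS = -2.0

def score_consumer(dossier: dict) -> float:
    """Logistic score in [0, 1] predicting some behavior, e.g. repayment."""
    z = BIAS + sum(w * dossier.get(k, 0.0) for k, w in MODEL_WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))

def recommend(dossier: dict, threshold: float = 0.5) -> str:
    """Offer the better rate above the threshold; withhold it below."""
    return "offer" if score_consumer(dossier) >= threshold else "withhold"

dossier = {"estimated_income": 85_000, "mortgage_size": 250_000,
           "age": 40, "health_flag": 1}
print(round(score_consumer(dossier), 2), recommend(dossier))
```

The point of the sketch is the books' shared observation: the consumer never sees the weights, the threshold or the score, only the resulting offer or refusal.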

But while both books emphasize the notion that consumer reputations are vulnerable to such covert scoring apparatuses, the authors differ markedly in the steps they say ordinary people might take to protect themselves.

Befitting the founder of a firm that markets reputation management, Mr. Fertik contends that individuals have some power to influence commercial scoring systems. He presents nascent technologies, like online education courses that can score people on the specific practical skills or concepts they have mastered, as democratizing forces that could enable workers to better compete for jobs on merit. His book suggests that readers curate, or hack, their digital reputations — for instance, by emphasizing certain keywords on their résumés to position them better for predictive scoring engines, or by posting positive reviews of restaurants or hotels online, in the hope that algorithms will flag them for future V.I.P. treatment.

“Employers’ algorithms will pick your résumé out of the pile of thousands just as instantaneously and robotically as they pass over others,” he and his co-author write. “Banks and lenders will automatically approve you for the better rates and offers. The more appealing dates on apps and sites like Tinder, Match and OkCupid will see your profile before they see any others.”

Think of this technique as reputation engine optimization. If an algorithm incorrectly pegs you as physically unfit, the book suggests that you can try to mitigate the wrong. You can buy a Fitbit fitness tracker, for instance, and upload the exercise data to a public profile, or even “snap that Fitbit to your dog” and “you’ll quickly be the fittest person in your town.”

Professor Pasquale offers a more downbeat reading. Companies, he says, are using such a wide variety of numerical rating systems that it would be impossible for average people to significantly influence their scores.

“Corporations depend on automated judgments that may be wrong, biased or destructive,” Professor Pasquale writes. “Faulty data, invalid assumptions and defective models can’t be corrected when they are hidden.”