So, maybe. It largely depends on the county in which the addresses are found.
Most (maybe all?) counties keep public records on property ownership (it's one of the ways Zillow, et al, get their background data). So the specific task seems to be finding that data, then figuring out a way to query it with specific addresses.
Depending on the interface of the county/counties' pages, it could be as simple as downloading a database & using Python/Excel/whatever to look up all of your addresses.
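For that simple case, here's a rough sketch of what the lookup might look like in Python with pandas. The file names and column names ("property_records.csv", "situs_address", "owner_name") are placeholders, not anything a specific county actually publishes:

```python
# Sketch: match a list of addresses against a downloaded county
# property-records export. File and column names are hypothetical --
# check what the county actually provides.
import pandas as pd

records = pd.read_csv("property_records.csv")   # the county's export
targets = pd.read_csv("my_addresses.csv")       # your address list

def normalize(col):
    # Upper-case and collapse whitespace so trivial formatting
    # differences don't prevent a match.
    return col.str.upper().str.replace(r"\s+", " ", regex=True).str.strip()

records["addr_key"] = normalize(records["situs_address"])
targets["addr_key"] = normalize(targets["address"])

matched = targets.merge(records[["addr_key", "owner_name"]],
                        on="addr_key", how="left")
matched.to_csv("owners_by_address.csv", index=False)
```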
If the county has a clunkier interface, where you have to query one address at a time, your task gets a little more difficult. Here are some solutions that could help with that task, in descending order of technical expertise required:
Script-based web-scraping. You can use R, Python, or another language, and any one of various packages to carry this out, but the learning curve can be steep. Scrapy is a good Python package if that's your weapon of choice (a rough sketch follows this list).
A GUI-based web scraping tool. I've only ever used import.io for this, but there are competitors. How useful they are depends on the exact technology the county's website uses.
Outsourcing to a service, like Amazon's Mechanical Turk. Essentially paying someone else to do it.
Calling the counties in question & seeing if they can help.
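If you end up going the script route from the first bullet, here's a rough sketch of what a Scrapy spider for a one-address-at-a-time search page might look like. The URL pattern, query parameter, and CSS selector are purely illustrative; every county site is different, so you'd inspect the real search page and adjust:

```python
# Sketch of a Scrapy spider that queries one address at a time.
# The URL and selector below are placeholders for whatever the
# county's search page actually uses.
from urllib.parse import quote_plus

import scrapy


class OwnerSpider(scrapy.Spider):
    name = "owner_lookup"

    # In practice you'd read these from a file.
    addresses = ["123 MAIN ST", "456 OAK AVE"]

    def start_requests(self):
        for addr in self.addresses:
            url = ("https://assessor.example-county.gov/search?address="
                   + quote_plus(addr))
            yield scrapy.Request(url, callback=self.parse,
                                 cb_kwargs={"address": addr})

    def parse(self, response, address):
        # Pull the owner field out of the result page.
        yield {
            "address": address,
            "owner": response.css("td.owner-name::text").get(),
        }
```

You'd run it with something like `scrapy runspider owner_spider.py -o owners.csv` and end up with a CSV of addresses and owners.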
You can get pretty far with something like Airtable.com. I can show you how I use it for free. If you're a software user who is more on the technical side, you may not even need someone like me. Like I said, I don't charge for just talking about this stuff - it's actually kind of fun.
It's a custom stack I built myself. It's a mix of JAMstack and SPA techniques (JS hydration, etc.).
The "base" itself is mithril.js (https://mithril.js.org), which is itself very extensible.
The main targets were on-site search performance and Time To First Byte (TTFB). The old website took 15 seconds or more at peak times and had very high monthly server costs. With this system, server costs were cut to roughly 1/12 of what they were.