If you're really interested in this topic, I strongly recommend the book Lies My Teacher Told Me, which covers a lot of things taught in high schools that aren't actually true.
Pretty much everything high school kids are taught about pre-1776 America is a myth. Columbus did not uniquely think the world was round (he incorrectly thought it was small), he didn't keep two sets of records to keep his suspicious crew from knowing how far they'd sailed, and the "Just give me three more days" story is complete fiction. The Mayflower was never meant to land in modern Massachusetts and may have been hijacked by the pilgrims, who were a minority of passengers. Most Americans aren't even aware of the Philippine War or how much the Spanish-American War was built on lies.
Just generally, American textbooks are built to "instill patriotism", which often really means "instill support for the American government".
Education is not centralized; each state decides for itself. Some states prioritize education, fund it well, and make sure the textbooks are high quality. Other states don't prioritize education, so funding the schools to make them the best they can be isn't as important to them.

Then there's the HUGE role of politics and religion in education, with some states having ideologues decide what is taught. Texas in particular is bad about this: one far-right dentist held great power for many years on the Texas board that determines textbook content for the entire state. Because Texas is the second most populous state, it's economical for textbook companies to tailor their textbooks to Texas, which has skewed textbooks all over the country. They weaken the teaching of evolution to conflict less with the Bible, and they politicize the teaching of history to emphasize far-right versions of it. This impacts education nationwide, but particularly the states where the politicians like this kind of thing. They don't want to teach critical thinking skills because they don't want kids to question what they're being taught.

So states with less of this politicization have better education, while more politicized states, where the school boards don't want students to learn a bunch of things, have worse education. If you want to know more, read Lies My Teacher Told Me.
Racism also plays a role. For a long time, many states segregated students--different schools for white and black kids. The schools for white kids were much better funded than the schools for black kids. Then in the 1950s and 1960s, there was a huge movement to desegregate schools. This led many white parents to pull their kids out of public schools because they didn't want their kids educated alongside black kids. But where could those white kids go? Private schools would cost more money. So school boards arranged for semi-private alternatives, like charter schools, that could draw from public school funds yet exclude minority kids. This took funds away from public schools, hurting their quality.
Almost EVERYTHING.
Seriously. It is nearly impossible to list how many ABSOLUTE LIES I was told as a child.
Growing up in the United States is nearly 100% pure propaganda.
The sheer level of denial is so vast I literally cannot fully talk about it. Growing up in the Deep South in the 1960s is, to me, an almost entirely different universe from what I know now.
And not just Elementary School, either: https://www.amazon.com/Lies-My-Teacher-Told-Everything-ebook/dp/B07DRP1GZ2/ref=tmm_kin_swatch_0?_encoding=UTF8&qid=&sr=
Lies My Teacher Told Me covers this subject very nicely.
Lies My Teacher Told Me: Everything Your American History Textbook Got Wrong https://www.amazon.com/dp/B07DRP1GZ2/ref=cm_sw_r_cp_apa_i_iaAzCbBSNS1XH