I haven't followed the franchise in a long time. I liked the first three movies, and the other three that came out in the early 2000s were okay, but I haven't seen anything since. I see a lot of complaints that the latest movies and shows have gone woke, with the exception of that one with Baby Yoda (I can't remember the name of the show). How badly has the franchise slipped into wokeness?