I don't live in Michigan; I've only had family there over the years. A few years back I saw a documentary on the revitalization of downtown Detroit, and it was really captivating. Such an awesome American town and so easy to root for. I'm just wondering what you Michiganders think of the Big D and whether it's improving at all. Are people and businesses moving back in? What does the future look like for the city? It seems like Detroit is the perfect example of an American city that 'was' and possibly can be again. Pretty broad question, but any insight would be great.