GOLDEN VALLEY, Minn. - Have you connected with your neighbors on Nextdoor?

The social site's mission is to create stronger, safer neighborhoods through the power of technology. But the company started to notice something happening: unintentional racial profiling.

"We call that unconscious bias," said Hamline University adjunct professor David Edgerton. "It's an automatic gravitation to where I'm comfortable, but being comfortable with a situation comes from the amount of exposure you have. So if you have limited exposure, then you're going to be biased within that space of what you're exposed to."

While the issue isn't new, a tech company stepping up to try to stop it is. Here's how it will work.

When someone wants to post about a crime or suspicious activity and chooses to use race as a descriptor, they will also be asked to add at least two other physical descriptors.
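As a rough illustration only, the rule described above amounts to a simple validation check. The function name and descriptor categories below are hypothetical, not Nextdoor's actual implementation:

```python
def can_submit_post(uses_race: bool, other_descriptors: list[str]) -> bool:
    """Hypothetical sketch of the posting rule described above.

    If race is used as a descriptor in a crime or suspicious-activity
    post, at least two other physical descriptors (e.g. clothing,
    shoes, height) must also be provided before the post is accepted.
    """
    if not uses_race:
        return True
    return len(other_descriptors) >= 2

# Example: race alone is not enough...
can_submit_post(True, ["red jacket"])                      # not allowed
# ...but race plus two other physical details is.
can_submit_post(True, ["red jacket", "white sneakers"])    # allowed
```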

"You're not saying, 'Don't say it's an African-American person that I just saw breaking into my car.' What you're saying is, 'Say it's an African-American person that has this type of clothing on, who has these kinds of shoes on, with this color,'" Edgerton said. "By adding more descriptors, or adding more information to, in this case, a crime that's occurring, you're saying, 'I don't care that they're in a certain group, I'm giving you all the information to describe this person that's committed this crime.' I'm okay with that, and I think there needs to be more of that."

Nextdoor hopes the move will encourage people to think before they post. Sometimes all it takes is one person, or one company, making a bold change to inspire change in others.

"I'd like to see more of it happen. I'd like to see what Facebook and Twitter do in response to seeing these kinds of social media companies make these kinds of changes," Edgerton concluded.