Three teams, three different solutions, one common goal
Code2College HackTogether was the first-ever hackathon to pair technologists with Code2College alumni to create impressive, impactful technical projects.
In just under 24 hours, three teams of software engineers partnered with Code2College alumni attending colleges including Baylor University, Columbia University, Georgetown University, Texas A&M University, and of course, The University of Texas at Austin to develop technical ESG (environmental, social, and governance) solutions.
Code2College alumni in the top 3 teams earned over $5,000 in college scholarships, while all participants (alumni and professionals) built their skills and networks during this unique experience.
A vast body of research shows that the hiring process is biased and unfair. From unconscious racism to ageism and sexism, biases play a significant role in who gets hired. And despite the well-established business case for diversity and the many post-George Floyd commitments to improve DEI efforts, many US-based companies have still been slow to move the needle on hiring practices specifically. The challenge: create a solution that addresses biases in the hiring process.
Project Name: Getting Rid of Bias
Our application is intended to be an internally-facing tool used by the HR team of a company to self-evaluate the biases in their hiring process and measure their improvement over time.
Companies can use the application to track how individuals involved in hiring are progressing at addressing their personal and procedural biases toward candidates.
Project Name: Apollo 18 Resume Review
Our resume-review-as-a-service toolkit flags many common issues: bad grammar, reading-level problems, and inappropriate text length.
Of course, a human helping a Code2College student wouldn't stop there. Humans can make judgments on less rigid criteria as well, such as how "professional" the text feels.
Unlike the existing tooling we repurposed for grammar and reading-level checks, we used a custom-trained machine learning model to assess text professionalism. Our research shows that the way you write can inherently reveal dominant, masculine, feminine, or age-related undertones. By evaluating the resume, we can inform the applicant before they potentially experience a biased resume review.
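To illustrate the general idea of a text-tone classifier (not the team's actual model, whose training data and architecture are not public), here is a minimal sketch: a from-scratch naive Bayes classifier trained on a tiny hypothetical dataset of resume bullet points labeled "professional" (1) or "casual" (0). All the example sentences are invented for illustration.

```python
import math
from collections import Counter

# Hypothetical toy training data: (text, label) pairs, 1 = professional tone.
TRAIN = [
    ("responsible for leading a cross-functional team of engineers", 1),
    ("collaborated with stakeholders to deliver the project on schedule", 1),
    ("implemented automated testing to improve release quality", 1),
    ("presented quarterly results to senior leadership", 1),
    ("i did lots of stuff with computers and things", 0),
    ("pretty good at coding i guess lol", 0),
    ("worked on whatever they told me to do", 0),
    ("stuff like fixing bugs and stuff", 0),
]

def tokenize(text):
    return text.lower().split()

class ToneClassifier:
    """Naive Bayes over unigram counts with Laplace (add-one) smoothing."""

    def __init__(self, examples):
        self.word_counts = {0: Counter(), 1: Counter()}
        self.class_counts = Counter()
        for text, label in examples:
            self.class_counts[label] += 1
            self.word_counts[label].update(tokenize(text))
        self.vocab = set(self.word_counts[0]) | set(self.word_counts[1])

    def score(self, text, label):
        # Log prior plus smoothed log likelihood of each token.
        total = sum(self.word_counts[label].values())
        logp = math.log(self.class_counts[label] / sum(self.class_counts.values()))
        for tok in tokenize(text):
            logp += math.log(
                (self.word_counts[label][tok] + 1) / (total + len(self.vocab))
            )
        return logp

    def predict(self, text):
        return max((0, 1), key=lambda label: self.score(text, label))

clf = ToneClassifier(TRAIN)
```

A production version would use a much larger labeled corpus and richer features, but the shape is the same: learn which word patterns correlate with each tone, then score new resume text before a human reviewer ever sees it.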
Project Name: Jobify
Jobify seeks to create a fairer environment for job searches. For people seeking employment, Jobify provides a fast, simple, and anonymous method to highlight their talents to potential employers.
For companies seeking employees, Jobify delivers a way to identify matches and review potential recruits quickly.
Voting ended November 26th, 2021.