The Academic Fringe Festival - Aaron Halfaker: Designing to Learn - Aligning Design Thinking and Data Science to Build Intelligent Tools That Evolve

04 April 2022, 17:00 to 18:00 - Location: Online
by Aaron Halfaker | Microsoft Research

Abstract
"Design to learn" is a collaborative approach to developing intelligent systems that leverages the complementary capabilities of designers and data scientists. Data scientists develop algorithms that work despite the noisy, messy realities of human behavior patterns, and designers develop techniques that reduce noise by aligning interactions closely with how users think about their work. In this talk, I'll describe a set of shared concepts and processes that are intended to help designers and data scientists communicate effectively throughout the development process. This approach is being applied and refined within various product contexts at Microsoft, including email triage, meeting recap, time management, and Q&A routing.

Speaker Biography
Aaron Halfaker is a principal applied research scientist working in the Office of Applied Research in Microsoft's Experiences and Devices organization. He is also a Senior Scientist at the University of Minnesota. Dr. Halfaker's research explores the intersection of productive information work and the application of advanced technologies (AI) to support productivity. In his systems-building research, he has worn many hats: full-stack engineer, ethnographer, engineering manager, UX designer, community manager, and research scientist. He is most notable for building ORES, an open infrastructure for machine learning in Wikipedia. His research and systems engineering have been featured in tech media including Wired, MIT Tech Review, BBC Technology, The Register, and Netzpolitik, among others. Dr. Halfaker reviews and coordinates for top-tier venues in the social computing and human-centered AI space, including ACM CHI, ACM GROUP, ACM CSCW, Transactions on Social Computing, WWW, and JASIST. Homepage: https://www.microsoft.com/en-us/research/people/ahalfaker/

More information
In this second edition on the topic of "Responsible Use of Data", we take a multi-disciplinary view and explore further lessons learned from success stories and from examples in which the irresponsible use of data can create and foster inequality and inequity, perpetuate bias and prejudice, or produce unlawful or unethical outcomes. Our aim is to discuss and draw up certain guidelines to make the use of data a responsible practice.

Join us
To receive announcements of upcoming presentations and events organized by TAFF and get the Zoom link to join the presentations, join our mailing list.

TAFF-WIS Delft
Visit the website of The Academic Fringe Festival

The Academic Fringe Festival - Nithya Sambasivan: The Myopia of Model Centrism

11 April 2022, 17:00 to 18:00 - Location: Online
by Nithya Sambasivan

Abstract
AI models seek to intervene in increasingly high-stakes domains, such as cancer detection and microloan allocation. What view of the world guides AI development in high-risk areas, and how does this view regard the complexity of the real world? In this talk, I will present results from my multi-year inquiry into how the fundamentals of AI systems---data, expertise, and fairness---are viewed in AI development. I pay particular attention to developer practices in AI systems intended for low-resource communities, especially in the Global South, where people are enrolled as labourers or as untapped DAUs (daily active users). Despite the inordinate role these fundamentals play in model outcomes, data work is under-valued; domain experts are reduced to data-entry operators; and fairness and accountability assumptions do not scale past the West. Instead, model development is glamourised, and model performance is viewed as the indicator of success. The overt emphasis on models, at the cost of ignoring these fundamentals, leads to brittle and reductive interventions that ultimately displace functional and complex real-world systems in low-resource contexts. I put forth practical implications for AI research and practice to shift away from model centrism towards enabling human ecosystems; in effect, building safer and more robust systems for all.

Speaker Biography
Dr. Nithya Sambasivan is a sociotechnical researcher whose work focuses on solving hard, socially important design problems affecting marginalised communities in the Global South. Her current research re-imagines AI fundamentals to work for low-resource communities. Dr. Sambasivan's work has been widely covered in venues such as VentureBeat, ZDnet, Scroll.in, O'Reilly, New Scientist, the State of AI report, and HackerNews, while influencing public policy such as the Indian government's strategy for responsible AI and motivating the NeurIPS Datasets track. As a former Staff Research Scientist at Google Research, she pioneered several original, award-winning research initiatives, including responsible AI in the Global South, human-data interaction, gender equity online, and next billion users, which fundamentally shaped the company's strategy for emerging markets and landed as new products affecting millions of users across Google Station, Search, YouTube, Android, Maps, and more. Dr. Sambasivan founded and managed a blueprint HCI team at Google Research Bangalore and set up the Accra HCI team, both in contexts with limited existing HCI pipelines. Her research has also received several best paper awards at top-tier computing conferences. Homepage: https://nithyasambasivan.com/

More information
In this second edition on the topic of "Responsible Use of Data", we take a multi-disciplinary view and explore further lessons learned from success stories and from examples in which the irresponsible use of data can create and foster inequality and inequity, perpetuate bias and prejudice, or produce unlawful or unethical outcomes. Our aim is to discuss and draw up certain guidelines to make the use of data a responsible practice.

Join us
To receive announcements of upcoming presentations and events organized by TAFF and get the Zoom link to join the presentations, join our mailing list.

TAFF-WIS Delft
Visit the website of The Academic Fringe Festival

How system safety can make Machine Learning systems safer in the public sector

Machine Learning (ML), a form of AI in which patterns are discovered in large amounts of data, can be very useful. It is increasingly used, for example, in the chatbot ChatGPT, facial recognition, and speech software. However, there are also concerns about the use of ML systems in the public sector. How do you prevent such a system from, for example, discriminating or making large-scale mistakes with negative effects on citizens? Scientists at TU Delft, including Jeroen Delfos, investigated how lessons from system safety can contribute to making ML systems safer in the public sector. "Policymakers are busy devising measures to counter the negative effects of ML. Our research shows that they can rely much more on existing concepts and theories that have already proven their value in other sectors," says Jeroen Delfos.

Learning from other sectors
In their research, the scientists used concepts from system safety and systems theory to describe the challenges of using ML systems in the public sector. Delfos: "Concepts and tools from the system safety literature are already widely used to support safety in sectors such as aviation, for example by analysing accidents with system safety methods. However, this is not yet common practice in the field of AI and ML. By applying a system-theoretical perspective, we view safety not only as a result of how the technology works, but as the result of a complex set of technical, social, and organisational factors." The researchers interviewed professionals from the public sector to see which factors are recognised and which are still underexposed.

Bias
There is room for improvement to make ML systems in the public sector safer. For example, bias in data is still often seen as a technical problem, while the origin of that bias may lie far outside the technical system. Delfos: "Consider, for instance, the registration of crime. In neighbourhoods where the police patrol more frequently, logically, more crime is recorded, which leads to these areas being overrepresented in crime statistics. An ML system trained to discover patterns in these statistics will replicate or even reinforce this bias. However, the problem lies in the method of recording, not in the ML system itself."

Reducing risks
According to the researchers, policymakers and civil servants involved in the development of ML systems would do well to incorporate system safety concepts. For example, it is advisable to identify in advance what kinds of accidents one wants to prevent when designing an ML system. Another lesson from system safety, for instance in aviation, is that systems tend to become riskier in practice over time, because safety becomes subordinate to efficiency as long as no accidents occur. "It is therefore important that safety remains a recurring topic in evaluations and that safety requirements are enforced," says Delfos. Read the research paper.
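To make the recording-bias mechanism Delfos describes concrete, here is a minimal, hypothetical sketch (not taken from the research paper): two neighbourhoods have identical underlying crime rates, but one receives more patrols and therefore dominates the recorded statistics, and a naive model that allocates the next round of patrols in proportion to recorded crime keeps that skew in place indefinitely.

```python
# Hypothetical illustration of recording bias (not from the TU Delft study):
# two neighbourhoods with identical true crime rates, one patrolled more often.
import random

random.seed(0)

TRUE_CRIME_RATE = {"A": 0.10, "B": 0.10}   # identical underlying rates
patrol_share = {"A": 0.70, "B": 0.30}      # A starts out patrolled more

recorded = {"A": 0, "B": 0}
for cycle in range(10):                    # ten patrol/"retraining" cycles
    for _ in range(1000):                  # 1000 patrol hours per cycle
        area = "A" if random.random() < patrol_share["A"] else "B"
        if random.random() < TRUE_CRIME_RATE[area]:
            recorded[area] += 1            # crime is only recorded where police are
    # A naive "model" allocates the next cycle's patrols in proportion to
    # recorded crime, so the initial skew is reproduced cycle after cycle.
    total = recorded["A"] + recorded["B"]
    patrol_share = {k: v / total for k, v in recorded.items()}

print(recorded)      # A accumulates roughly 70% of recorded crime...
print(patrol_share)  # ...even though the true crime rates are identical
```

The skew never corrects itself, because the data the model sees is itself a product of earlier patrol decisions; as the researchers note, the problem lies in the method of recording, not in the model.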

Boosting sustainable building education

On 17 September, TU Delft launched a new initiative to bring sustainable building practices into the Dutch educational landscape by connecting educators from Dutch vocational institutions (MBO) with TU Delft lecturers. Last week, the kick-off event at The Green Village on the TU Delft campus brought together 10 educators from MBO institutions and lecturers from TU Delft's Sustainable Building with Timber MOOC.

Educating for impact
From September to December 2024, the MBO educators will participate in the MOOC as students: watching videos, completing course exercises, and submitting assignments. Additionally, they will take part in online sessions guided by TU Delft lecturers, who provide subject-matter expertise, and by an educational expert who supports the online learning process. From December to June 2025, the focus will shift to creating adaptable, open teaching resources specifically developed for MBO institutions.

A ripple effect
By equipping teachers with the tools and knowledge to teach sustainable building, the initiative supports the transition to more environmentally responsible practices within the building industry. The knowledge shared through this programme, made possible in collaboration with Leren voor Morgen and the MBO Raad (the council for schools of vocational education and training), will shape the future workforce and contribute to a more sustainable world. While initially targeting a limited number of MBO institutions, the initiative's impact is expected to extend far beyond them. As educators integrate the materials into their curricula, the knowledge will reach future generations of students, amplifying the long-term influence of the project.

Sustainable Building with Timber MOOC: course information

A two-way learning process
This mutually beneficial project embodies lifelong learning. MBO teachers gain access to cutting-edge teaching materials on building with timber, while TU Delft benefits from the practical insights these practitioners bring from the field. This knowledge exchange enhances vocational education and strengthens TU Delft's research and teaching.

Open resources for lasting impact
A key goal of the project is to create open-access, customisable teaching materials, enabling educators to tailor content to the specific needs of their institutions and students. This flexible approach fosters the teaching of sustainable building techniques.

Acknowledgements
Heartfelt thanks to everyone involved in making this initiative possible. Together, we are laying the foundations for a more sustainable future.

Three Students Nominated for the ECHO Award

Three TU Delft students have been nominated for the ECHO Award 2024. The ECHO Award is given to students with a non-Western background who are actively engaged in society. Sibel, TJ, and Pravesha talk about their background and their nomination. The finalists will be selected on September 27th.

Sibel Gökbekir
How has your background influenced your academic journey?
As a woman with Turkish roots, my academic journey has been about more than just pursuing degrees in engineering and law; it's been about consistently advocating for the diverse needs of women and multicultural groups, ensuring their voices are heard in important decisions. This is why I actively contributed to different board positions at TU Delft, working to promote inclusivity and equality. My background inspired me to explore how engineering, law, and social justice intersect, particularly in empowering marginalised communities. I chose to study energy transitions and human rights to contribute to a fairer, more inclusive world.
How have you turned this into contributions to society?
I've dedicated my academic and personal life to promoting diversity and inclusion. As a youth ambassador for Stop Street Harassment, I aimed to create safer spaces for women and minorities because I believe everyone has the right to feel free and safe in society. Through the Turkish Golden Tulip Foundation, I advocated for vulnerable communities in earthquake relief. Additionally, I founded an initiative for migrant students in Rotterdam-South and have been committed to improving educational opportunities for secondary school students with a migration background. I have also given guest lectures across the Netherlands to educate the younger generation about climate change and equitable energy transitions, emphasising the importance of a fair transition for all communities.
What does it mean for you to be nominated for the ECHO Award?
I feel very honoured to have been nominated on behalf of TU Delft. My commitment to community engagement is part of who I am, and therefore the ECHO Award is more than just a recognition; it offers me an opportunity to further expand my contributions to a more inclusive society. As an ECHO Ambassador, I plan to expand my efforts in promoting equality and sustainability, while inspiring others to take action for a more equitable world.

TJ Rivera
How has your background influenced your academic journey?
My background as a Filipino in a Dutch-speaking bachelor's programme made my academic journey both challenging and enriching. Being gay in a male-dominated field like Architecture, where most role models were heteronormative men, added another layer of difficulty. It was intimidating not to see people like me represented. However, this experience fuelled my belief that systems can and should be challenged, changed, and updated. I aimed to bring a fresh perspective, advocating for greater diversity and inclusivity in the field.
How have you turned this into contributions to society?
I translated my personal challenges into tangible contributions by advocating for inclusivity within architecture. Together with like-minded individuals, I began exploring the intersection of identity, sexuality, and architecture, and collaborated with my faculty's diversity team to raise awareness. As I became known for my work with the queer community, I saw an opportunity to create lasting change. I co-revived ARGUS, the once-inactive study association for the Master of Architecture, which now serves as a platform to discuss and address issues of diversity within the field. This initiative continues to foster a more inclusive academic environment.
What does it mean for you to be nominated for the ECHO Award?
Being nominated for the ECHO Award is a significant milestone in my journey to expand my mission beyond the confines of my faculty. This national platform provides the opportunity to raise awareness and advocate for social justice on a larger scale. I believe students are key to driving change, and my focus is on amplifying the voices of the queer community, which is often overlooked. The ECHO Award will enable me to form partnerships with organisations and universities, further promoting diversity, inclusivity, and equality. It's a chance to create broader, tangible change, addressing the needs of those who often go unheard.

Pravesha Ramsundersingh
How has your background influenced your academic journey?
As a woman in STEM (Science, Technology, Engineering, and Mathematics), my background has been a powerful motivator to challenge gender disparities within Computer Science. Having experienced first-hand the underrepresentation of women in this field, I have been driven not only to excel academically but also to become an advocate for diversity. Through leadership roles in the Faculty and Central Student Councils, I've focused on creating an inclusive environment that supports women and minority students, ensuring that everyone has the opportunity to succeed.
How have you turned this into contributions to society?
I've translated my experiences into actionable contributions by actively advocating for diversity, equity, and inclusion (DEI) at TU Delft. I secured sexual education and consent training for 3,000 freshman students, and I led initiatives such as the Social Safety Initiatives Conference alongside the Dutch National Coordinator against Racism and Discrimination. In my student governance roles, I pushed for policies that address gender discrimination and social safety concerns, creating a more supportive environment for students of all backgrounds to thrive in both academic and social spaces.
What does it mean for you to be nominated for the ECHO Award?
Being nominated for the ECHO Award is an incredible honour that highlights the importance of the work I have done to promote DEI. It inspires me to continue advocating for systemic change in the tech industry and academia. This nomination reaffirms my commitment to driving equity in STEM, ensuring that future generations have more inclusive opportunities. It also motivates me to keep pushing boundaries and to empower others to take action for a more just and equal society.

The ECHO Award
Every year, ECHO, the Center for Diversity Policy, invites colleges and universities to nominate socially active students who make a difference in the field of Diversity & Inclusion for the ECHO Award 2024. The ECHO Award calls attention to the specific experiences that students with a non-Western background* carry with them and the way they manage to turn these experiences into a constructive contribution to society. Winners are selected by an independent jury and may attend a full-service Summercourse at UCLA in the United States in 2025. Read more: ECHO Award - ECHO (echo-net.nl)