ROOST: Safety Tooling needs Open Tech🐓🤗

Community Article · Published February 10, 2025

Today marks the launch of the ROOST organization, which stands for Robust Online Open Safety Tools. ROOST is set to fill a long-neglected and critical gap in infrastructure, enabling a more distributed, accountable, and accessible path to safer technology. Hugging Face will serve as a partner: we take a similar approach to these shared problems, prioritizing resource sharing, external visibility, and pluralistic engagement to support solutions to challenges that concern everyone.

The effectiveness of safety measures depends on the tools we use. Unfortunately, discussions about AI system safety often overlook practical considerations: what specifically can be done, at what cost, and under what technical conditions. This results in part from a tendency to treat safety-relevant tools and techniques as a guarded competitive advantage rather than a shared good, which in turn often compromises quality when stakeholders lack transparency and informed decision-making power. ROOST's emphasis on open tools is a crucial step towards rectifying these issues by fostering collaboration and accessibility.

This openness is particularly relevant to the distributed and collaborative developer ecosystem that Hugging Face supports. Open and collaborative development of AI has distinct advantages over isolated and private development by single entities: it enables interoperability, plays an indispensable role in enabling research to support informed governance, and makes technology development overall more efficient and better aligned with the needs of millions of diverse use cases and contexts. But it also faces barriers to meeting its full potential as a safety-promoting approach. Small-scale developers working on specific projects within the broader development chain often lack the resources to create new safety tools from scratch, face disproportionate challenges in adopting off-the-shelf solutions that might not suit their specific needs, and are sometimes even denied access to those tools altogether. By developing open and accessible alternatives that members of the Hugging Face community can use when curating datasets on the platform, training, fine-tuning, and integrating models, and developing user-facing apps, ROOST can help minimize or even eliminate those barriers. Further, by offering tools tailored to every stage of the development process rather than trying to make AI systems safer primarily at the deployment layer, it can foster much more robust systems overall.

Open safety tools also support more inclusive safety interventions by larger and better-resourced developers. Development choices, especially around safety and security, always involve trade-offs between different values and interests. When large developers whose expertise is primarily technical and driven by commercial product requirements are made entirely responsible for managing those trade-offs, we’ve seen time and again that the needs of more marginalized stakeholders are consistently deprioritized. Mitigating those issues requires developers to expose their choices to external scrutiny, and possibly criticism, from those stakeholders. While this can seem a daunting prospect for individual actors, embracing more open and collaborative development helps ensure that these necessary interactions focus on industry practices rather than particular choices, and that measures taken to address the limitations of common approaches propagate better through the entire industry.

Transparent and accessible safety tools are essential. They empower regulators and policymakers to make well-informed choices, ensure that safety can be prioritized across the full development cycle by all categories of developers, and support critical research and participation. The ROOST organization is particularly well positioned to enable this work and to make safety interventions for AI more robust and accountable, in collaboration with the Hugging Face community.
