The Global Research Partnerships Alliance is a coalition of Swiss institutions strengthening collaborations with partners around the world to advance science for sustainable development.

Image: NASA

AI and global collaborations – case studies on overcoming common challenges

Real projects make the trade-offs of AI in global partnerships tangible. The case studies below show where collaborations stumble - and how a similar setup can be organized for safety, equity, and real-world impact - so that you can spot risks early and choose practices that actually work.

A Swiss research institution partnered with a South American team to design a mobile app that nurses would use to photograph skin lesions for screening. The main pitfalls this collaboration encountered included:

Governance: Funding, coordination, and decision-making sat largely with the Swiss research institution, while the counterpart team in South America changed frequently. The ethics review covered the clinical study, but the AI module's lifecycle (data reuse, cross-border transfer, validation, deployment) was not examined in depth.

Users and context: Local caregivers collected images through a simple interface and envisioned an “analyze this image” button. An acceptability study found high trust among caregivers, but it was biased because many participants had already used the app. Nurses were treated as data collectors rather than co-creators, and patient perceptions (crucial for adoption) were not integrated into the design. AI performance could weaken when applied to images from different phone models, due to variations in camera hardware and software (a per-device evaluation sketch follows this list).

Responsible integration: Local caregivers and officials (health ministry teams) were concerned about skill erosion if AI recommendations were taken at face value. The division of responsibility between the AI system’s designers and its end users was unclear.

Sustainability: Capacity building flowed mostly one way, and local constraints (power, devices, connectivity) did not drive technical choices. Post-project options (a startup, vendor partnerships) risked shifting costs to patients and creating technological dependencies and lock-in, trapping users in a specific system.
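One way to surface the camera-variation pitfall early is to disaggregate evaluation by phone model instead of reporting a single accuracy figure. The following is a minimal Python sketch of that idea; the record fields (device, prediction, label) are hypothetical placeholders, not details from the actual project.

```python
from collections import defaultdict

def accuracy_by_device(records):
    """Report per-device accuracy for screening predictions.

    `records` is a list of dicts with hypothetical keys 'device',
    'prediction', and 'label'; the field names are illustrative,
    not taken from the project described above.
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for r in records:
        total[r["device"]] += 1
        correct[r["device"]] += int(r["prediction"] == r["label"])
    return {device: correct[device] / total[device] for device in total}

# A gap between devices flags camera-driven drift that a single
# aggregate accuracy figure would hide.
records = [
    {"device": "phone_A", "prediction": 1, "label": 1},
    {"device": "phone_A", "prediction": 0, "label": 0},
    {"device": "phone_B", "prediction": 1, "label": 0},
    {"device": "phone_B", "prediction": 0, "label": 0},
]
print(accuracy_by_device(records))  # {'phone_A': 1.0, 'phone_B': 0.5}
```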


A research consortium set out to build an AI tool to flag deforestation risks around a protected area. Standard models pointed to “infrastructure expansion” as the main driver. On the ground, that was wrong: community members identified illegal timber harvesting and tourism spillovers from eco-lodges as key pressures - knowledge no satellite or desk study could surface.

Co-design, not consultation: The consortium established a local satellite office and partnered with community leaders, rangers, and civil-society groups as co-designers. These partners annotated satellite tiles with field notes, and interviews documented informal routes and market dynamics. This shifted the model’s features and labels to reflect lived reality rather than imported assumptions.

Governance: Data-sharing and IP are governed jointly, and community representatives sit on the steering group. Training runs both ways: the consortium’s researchers learn local context and values, while local partners learn data handling and model basics. The team acknowledges and actively counters colonial research habits by centering local priorities and consent at each stage.

How the tool works in practice: The system combines remote-sensing signals with community inputs (field patrol logs, locally reported hotspots, tourism activity) to produce risk scores and explanations. Outputs are probabilistic and action-oriented (“patrol here in the next 72 hours”), with clear uncertainty cues. Human oversight is built in: rangers review alerts and provide feedback, and the model is retrained on a scheduled cadence.
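To make the description above concrete, here is a minimal Python sketch of how such signals might be fused into a probabilistic, action-oriented alert. The feature names, weights, and threshold are illustrative assumptions, not the consortium’s actual model.

```python
import math

# Hypothetical feature weights and bias; a real system would learn
# these from labeled data rather than hard-coding them.
WEIGHTS = {"satellite_change": 2.0, "patrol_reports": 1.5, "tourism_activity": 0.8}
BIAS = -2.5

def risk_score(signals):
    """Fuse normalized signals (values in [0, 1]) into a probability
    via a logistic combination, keeping the output probabilistic."""
    z = BIAS + sum(WEIGHTS[name] * signals.get(name, 0.0) for name in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

def alert(signals, threshold=0.7):
    """Turn a score into an action-oriented message with an uncertainty cue."""
    p = risk_score(signals)
    if p >= threshold:
        return f"Patrol this tile within 72 hours (estimated risk {p:.0%})."
    return f"No patrol needed now (estimated risk {p:.0%}); rangers may still review."

# Strong satellite change plus recent patrol reports pushes the tile
# over the alert threshold.
print(alert({"satellite_change": 0.9, "patrol_reports": 1.0, "tourism_activity": 0.4}))
```

Keeping the output a probability with an explicit threshold, rather than a bare yes/no, matches the uncertainty cues and ranger review described above: the score can be shown alongside the alert, and the threshold tuned with field feedback.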

Sustainability: Funding covers maintenance, not only the creation phase. The local office hosts operations and develops a lightweight service model so partners can be self-reliant over time (e.g., public support or conservation levies rather than paywalls).
