Design-based research (DBR) is a participatory methodology aimed at developing and testing innovative solutions in real-world contexts while generating theoretical insights. Its goal is to solve complex, practical problems through iterative cycles of design, implementation, and evaluation, often in collaboration with stakeholders such as practitioners and end users. For example, DBR might involve designing and refining an educational technology tool to improve student engagement or developing a workplace intervention to improve productivity.


Methods and methodologies

Prototyping – Prototyping involves creating initial versions of a solution to test its feasibility, usability, and effectiveness in real-world settings. For example, in a DBR educational project, researchers might develop a prototype of an interactive learning module and test it in classrooms.

Methodology:

Researchers design the prototype based on existing theoretical frameworks and stakeholder input. Initial testing takes place in the intended setting (e.g., schools, workplaces), and data is collected through observations, user feedback, and performance measurements. Feedback is used to iteratively refine the prototype, address issues, and improve its functionality. This iterative process ensures that the prototype evolves to effectively meet real-world needs.
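The design-test-refine cycle described above can be sketched in code. This is a minimal illustration with hypothetical names (`Prototype`, `collect_feedback`, `refine`), not part of any DBR toolkit; real feedback collection would involve observations, surveys, and performance data rather than the simulated issue list used here.

```python
from dataclasses import dataclass, field

@dataclass
class Prototype:
    """A hypothetical learning-module prototype tracked across iterations."""
    version: int = 1
    open_issues: list = field(default_factory=list)

def collect_feedback(prototype):
    # Placeholder for classroom testing: observations, user feedback,
    # performance measurements. Simulated here as fewer issues being
    # found as the prototype matures.
    return ["issue"] * max(0, 3 - prototype.version)

def refine(prototype, feedback):
    # Address the issues found, then release a new version.
    prototype.open_issues = []  # issues resolved in this revision
    prototype.version += 1
    return prototype

def iterate_until_stable(prototype, max_cycles=10):
    """Run design-test-refine cycles until testing surfaces no issues."""
    for _ in range(max_cycles):
        feedback = collect_feedback(prototype)
        if not feedback:
            return prototype  # prototype meets real-world needs
        prototype = refine(prototype, feedback)
    return prototype

p = iterate_until_stable(Prototype())
print(p.version)
```

The loop terminates either when a cycle surfaces no new issues or when a fixed budget of cycles is exhausted, mirroring how DBR projects bound the number of iterations by time and resources.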

Iterative evaluation – Iterative evaluation systematically assesses the effectiveness of a solution over multiple design and implementation cycles. For example, a study on improving medication adherence might iteratively evaluate a mobile health app, incorporating user feedback and behavioral data.

Methodology:

Each iteration involves implementing the solution, collecting data (e.g., surveys, interviews, performance metrics), and analyzing the results to identify strengths, weaknesses, and areas for improvement. Adjustments are made to the design based on the results, and subsequent cycles aim to optimize the usability and impact of the solution. Mixed-methods approaches are typical: DBR often combines qualitative information (e.g., user feedback) with quantitative measures (e.g., usage data).
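As a rough sketch of how one evaluation cycle might combine both data types, the snippet below averages quantitative survey scores and applies a naive keyword coding to qualitative comments. The function name, the 1-5 score scale, and the keyword-matching scheme are illustrative assumptions; real qualitative coding is far more involved.

```python
from statistics import mean
from collections import Counter

def evaluate_iteration(usability_scores, feedback_comments, keywords):
    """Summarize one evaluation cycle: mean quantitative score plus a
    count of qualitative themes (naive keyword coding, for illustration)."""
    quantitative = mean(usability_scores)
    themes = Counter()
    for comment in feedback_comments:
        for kw in keywords:
            if kw in comment.lower():
                themes[kw] += 1
    return {"mean_score": quantitative, "themes": dict(themes)}

summary = evaluate_iteration(
    usability_scores=[4, 5, 3, 4],  # e.g., 1-5 survey items from app users
    feedback_comments=["Navigation is confusing", "Great reminders"],
    keywords=["navigation", "reminders"],
)
print(summary)
```

Each cycle's summary can then be compared against the previous one to decide which design adjustments the next iteration should make.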

Intervention studies – Intervention studies evaluate the impact of the solution on real-world practices and outcomes. For example, an intervention study might examine how a new instructional strategy influences student learning outcomes in various classroom settings.

Methodology:

The intervention is implemented in collaboration with practitioners to align with real-world needs and constraints. Data collection methods include pre- and post-intervention assessments, observations, and stakeholder interviews. Analysis focuses on measuring the effectiveness of the solution and understanding the factors influencing its success. The knowledge gained is used to refine both the solution and the associated theoretical framework.
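The pre- and post-intervention comparison mentioned above can be sketched as a simple paired-gain analysis. The function and the example scores are hypothetical; a real intervention study would add inferential statistics (e.g., a paired t-test) and effect sizes.

```python
from statistics import mean, stdev

def pre_post_gain(pre_scores, post_scores):
    """Summarize per-participant gains for a hypothetical pre/post
    intervention assessment (scores paired by participant)."""
    if len(pre_scores) != len(post_scores):
        raise ValueError("pre and post scores must be paired per participant")
    gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
    return {
        "mean_gain": mean(gains),
        "sd_gain": stdev(gains) if len(gains) > 1 else 0.0,
        "improved": sum(g > 0 for g in gains),  # participants who gained
        "n": len(gains),
    }

result = pre_post_gain(pre_scores=[60, 55, 70, 65],
                       post_scores=[72, 58, 75, 80])
print(result)
```

Summaries like this, alongside observations and stakeholder interviews, inform both the refinement of the solution and the associated theoretical framework.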

Good practices

Engage stakeholders early:
Collaborate with practitioners and end users from the start to ensure the solution meets real-world needs and constraints.

Iterate continuously:
Use iterative cycles of design, implementation, and evaluation to refine the solution based on empirical evidence and stakeholder feedback.

Integrate theory and practice:
Base the design on existing theories while allowing the research process to generate new theoretical insights.

Document each iteration:
Maintain detailed records of design decisions, implementation processes, and evaluation results to improve transparency and reproducibility.

Adapt to the context:
Adapt the solution and assessment strategies to the specific needs and constraints of the context to ensure relevance and effectiveness.

Ensure scalability:
Design solutions with scalability and adaptability in mind, so they can be applied in diverse contexts and populations.

What to avoid

Insufficient testing:
Rushing into implementing a solution without proper prototyping and iterative evaluation can produce impractical or ineffective solutions.

Ignoring stakeholder contributions:
Excluding practitioners or end users from the design process can lead to solutions that fail to effectively address real-world challenges.

Neglecting contextual factors:
Failure to consider the unique constraints and opportunities of the context limits the applicability and success of the solution.

Lack of documentation:
Failing to document the iterative process reduces the transparency, reproducibility, and credibility of the research.

Premature generalization:
Drawing conclusions about the effectiveness of the solution before testing it in various contexts undermines its credibility and validity.

Neglecting the development of theory:
Focusing solely on practical results without contributing to theoretical understanding limits the broader impact of research.

Conclusion

Design-based research is a powerful and dynamic approach to solving real-world problems through the iterative development and refinement of innovative solutions. Using methods such as prototyping, iterative evaluation, and intervention studies, DBR bridges the gap between theory and practice. Adherence to good practices, such as stakeholder engagement, process documentation, and designing for scalability, ensures that DBR produces effective, adaptable, and theory-grounded solutions. This methodology is invaluable in addressing complex challenges across disciplines, making significant contributions to both practical applications and theoretical advances.
