Data Management and Other Fundamentals for Efficient and Reproducible Research
Issues of reproducibility and replication increasingly dominate conversations across the behavioral, data, health, and social sciences. Good workflow practices ensure that all stages of the research process—from reviewing and saving relevant literature and sharing files among collaborators to using software effectively and conducting data management and analysis tasks—can be completed effectively, efficiently, and accurately. The research process contains many steps and, often, thousands of files. Mistakes at any stage—be it planning, documenting, data collection, data cleaning, variable creation and manipulation, data analysis, writing, replication, or preservation—can waste time and cause unnecessary stress. This workshop provides practical, easily implementable best practices for anyone who works with quantitative data, ensuring that their workflow leads to accurate and reproducible research results. The methods discussed in the workshop are general and not tied to any particular software package for data analysis.