LBNL Culture Survey: Methodology Overview
Survey Development Approach
From April through December 2023, the Lab's Culture Data Scientist undertook the following activities to develop the LBNL Culture Survey:
Reviewed academic literature for validated survey instruments related to engagement and similar outcomes
Used validated questions (and will conduct further validation)
Checked for face validity via stakeholder meetings
Cross-referenced questions with past surveys and with existing surveys conducted by divisions or by UC Berkeley
Chose comparable questions where possible
Used similar constructs to prior surveys, with some new types of questions related to learning processes
Held discussions about key priorities with Lab Directorate, ERGs, IDEA Chairs Council, divisional representatives from user facilities, HR Division Partners, and other stakeholder groups
Took best practices from division surveys and tested survey questions on diverse groups in the Lab
Sampling & Weighting Strategy
This culture survey went to all Lab employees, both represented and non-represented. The survey did not include affiliates/contingent workers this round.
Over 50% of employees took the survey. It is worthwhile to listen to the voice of such a large proportion of employees, and the high response rate makes it likely that the survey captured a wide range of employee feedback.
Even given the high response rate, there is always a risk that those who take the survey are systematically different from those who do not. To help mitigate this risk, we used propensity scores to weight the results by Area, Employee Class, gender, and race/ethnicity.
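The propensity-weighting step can be sketched as follows. This is a minimal illustration, not the Lab's actual pipeline: the employee records, column names, and `responded` flag are all synthetic stand-ins for the weighting variables named above.

```python
# Sketch of inverse-propensity weighting for survey non-response.
# All data here is synthetic; columns stand in for the real
# weighting variables (Area, Employee Class, gender, race/ethnicity).
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
employees = pd.DataFrame({
    "area": rng.integers(0, 5, n),        # stand-in for Area
    "emp_class": rng.integers(0, 3, n),   # stand-in for Employee Class
    "gender": rng.integers(0, 2, n),
    "race_eth": rng.integers(0, 4, n),
    "responded": rng.integers(0, 2, n),   # 1 = took the survey
})

# Model the probability of responding from the weighting variables.
X = pd.get_dummies(
    employees[["area", "emp_class", "gender", "race_eth"]].astype("category")
)
model = LogisticRegression(max_iter=1000).fit(X, employees["responded"])
propensity = model.predict_proba(X)[:, 1]

# Respondents are up-weighted in proportion to how unlikely
# their demographic profile was to respond.
respondents = employees[employees["responded"] == 1].copy()
respondents["weight"] = 1.0 / propensity[employees["responded"] == 1]

# Weighted construct means would then use these weights, e.g.:
# np.average(scores, weights=respondents["weight"])
print(respondents["weight"].describe())
```

Groups that respond at lower rates receive larger weights, so the weighted results better approximate the full employee population.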
Survey Validation
We rely on theoretical justification and past research for construct creation. However, we make minor modifications based on survey validation. In particular:
Confirmatory factor analysis (CFA) to test whether the factor structure was consistent with the intended constructs
Inter-item correlations within each construct to flag questions that may need adjustment
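The inter-item check can be illustrated with a small sketch: pairwise correlations among a construct's items, plus Cronbach's alpha as a standard internal-consistency summary. The item names and 1-5 responses below are made up for illustration.

```python
# Illustrative inter-item check for one construct: pairwise item
# correlations and Cronbach's alpha. Responses are simulated 1-5
# ratings driven by a shared latent trait.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
latent = rng.normal(size=300)
items = pd.DataFrame({
    f"q{i}": np.clip(np.round(3 + latent + rng.normal(scale=0.8, size=300)), 1, 5)
    for i in range(1, 5)
})

# Pairwise inter-item correlations; an item correlating well below
# the others is a candidate for rewording or removal.
print(items.corr().round(2))

def cronbach_alpha(df: pd.DataFrame) -> float:
    """Alpha = k/(k-1) * (1 - sum of item variances / total-score variance)."""
    k = df.shape[1]
    item_vars = df.var(axis=0, ddof=1)
    total_var = df.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

print(f"alpha = {cronbach_alpha(items):.2f}")
```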
We did not do quantitative pre-survey validity testing because:
Survey constructs and most items have been validated in prior surveys.
Constructs overlap, so we did not expect validation to show fully distinct constructs.
We did not want to set aside a large subsample of the Lab for validation.
There is strong interest in each specific question.
Data Sources
We will merge survey responses with other data sources to understand the relationships between the cultural constructs and the areas of workplace well-being that the survey measures. Only the Culture Data Scientist will have the employee-level information linking survey responses to these data sources, and individual names will be deleted immediately (please see the Data Privacy page). All reporting will be aggregated and confidential. These data sources include:
Retention
Years at the Lab
Seniority
Job type
Area and division
Race/ethnicity
Career or term
Employee class & work group (e.g., administrator, scientist, engineer)
Manager status
Gender
Transparency is Essential
Privacy & confidentiality of your survey data will be handled with the utmost care. Multiple protocols have been put in place to protect your data and ensure that the LBNL Culture Survey is a feedback channel you can trust.
LBNL Culture Survey: Outputs
Overall
Construct scores: Each construct's score is the average of the responses to the questions in that construct.
Descriptive statistics: The average score of each construct will be shown by the variables listed in the Data Sources section (such as Area and years at the Lab). We also consider intersections of variables, such as gender by job type.
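Construct scoring and the group-level descriptive statistics can be sketched in a few lines of pandas. The question and construct names below are hypothetical placeholders.

```python
# Minimal sketch of construct scoring: each construct's score is the
# mean of its questions' responses. Column and construct names are
# hypothetical placeholders.
import pandas as pd

responses = pd.DataFrame({
    "q1": [4, 5, 3], "q2": [4, 4, 2],   # e.g., belonging items
    "q3": [5, 3, 4], "q4": [4, 4, 5],   # e.g., learning items
    "area": ["A", "A", "B"],
})

constructs = {"belonging": ["q1", "q2"], "learning": ["q3", "q4"]}
for name, cols in constructs.items():
    responses[name] = responses[cols].mean(axis=1)

# Descriptive statistics by a grouping variable such as Area.
print(responses.groupby("area")[list(constructs)].mean())
```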
We ran t-tests to check for constructs on which a given group's results differed, in a statistically significant way, from the results for the rest of the Lab. Due to the large number of tests conducted, we applied a Benjamini-Hochberg correction to control the false discovery rate. We also applied a finite population correction, given the large proportion of the Lab population that responded.
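A group-vs-rest comparison with a Benjamini-Hochberg correction can be sketched as below. The scores, group labels, and headcount are synthetic; the finite population correction is shown as the standard factor rather than as the exact formula used in the Lab's analysis.

```python
# Sketch of group-vs-rest t-tests with a Benjamini-Hochberg
# correction. Scores, groups, and headcount are synthetic.
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(2)
N_population = 4000                      # assumed Lab headcount
scores = rng.normal(3.8, 0.7, size=2200) # respondents' construct scores
groups = rng.choice(["G1", "G2", "G3", "G4"], size=scores.size)

pvals = []
for g in ["G1", "G2", "G3", "G4"]:
    in_g = groups == g
    # Welch t-test: the group vs everyone outside the group.
    t, p = stats.ttest_ind(scores[in_g], scores[~in_g], equal_var=False)
    pvals.append(p)

# Benjamini-Hochberg control of the false discovery rate.
reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")

# Standard finite population correction: with n of N responding,
# standard errors shrink by sqrt((N - n) / (N - 1)).
fpc = np.sqrt((N_population - scores.size) / (N_population - 1))
print(p_adj.round(3), f"FPC = {fpc:.3f}")
```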
OLS regression: A regression helps clarify which constructs most strongly drive employee engagement (an outcome measure in the survey), with the variables listed in the Data Sources section above included as covariates.
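The regression step might look like the following statsmodels sketch. The construct names, covariates, and coefficients are all hypothetical and generated for illustration only.

```python
# Illustrative OLS of engagement on construct scores plus covariates.
# Variable names and effect sizes are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 500
df = pd.DataFrame({
    "belonging": rng.normal(3.8, 0.6, n),
    "learning": rng.normal(3.5, 0.7, n),
    "tenure_years": rng.integers(0, 30, n),
    "area": rng.choice(["A", "B", "C"], n),
})
# Simulated outcome: engagement driven mostly by belonging.
df["engagement"] = (0.5 * df["belonging"] + 0.3 * df["learning"]
                    + rng.normal(scale=0.5, size=n))

# C(area) enters the categorical covariate as dummy variables.
fit = smf.ols("engagement ~ belonging + learning + tenure_years + C(area)",
              data=df).fit()
print(fit.params.round(2))
```

Constructs with larger (and statistically significant) coefficients are the stronger candidates for driving engagement, holding the covariates fixed.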
Benchmarks: We considered available industry benchmarks and prior divisional surveys as points of reference. This survey will serve as a baseline to look at cultural change over time.
Detailed Results
These show:
Question-level averages
Net Promoter Scores
Top-5 Likes, Dislikes, and Motivators
Results are also shown by a variety of groups:
Demographic groups
Employee Class and work group
Seniority, work mode, and tenure
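The Net Promoter Score listed above follows a standard formula on a 0-10 item: percent promoters (ratings 9-10) minus percent detractors (ratings 0-6). A minimal sketch, with illustrative ratings:

```python
# Standard Net Promoter Score on a 0-10 scale:
# promoters rate 9-10, detractors rate 0-6, passives (7-8) are ignored.
def nps(ratings: list[int]) -> float:
    promoters = sum(r >= 9 for r in ratings)
    detractors = sum(r <= 6 for r in ratings)
    return 100 * (promoters - detractors) / len(ratings)

print(nps([10, 9, 8, 7, 6, 3]))  # 2 promoters, 2 detractors -> 0.0
```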
Qualitative Data
Open-ended questions will be analyzed using grounded theory to inductively develop themes that emerge from a close reading of the responses. The responses will then be coded by theme. The themes will be organized according to the constructs to integrate with and enrich our understanding of the quantitative results. The themes will also be compared across key subgroups, such as by race and gender.
We will report on the prevalence of the themes to understand how representative they are; the goal is to surface representative responses as well as useful outlier feedback.
Illustrative quotes will be constructed by combining several similar quotes to preserve confidentiality.