This is called a two-way interaction. An interaction can occur between independent variables that are categorical or continuous, and across multiple independent variables.

When testing the interaction between a categorical and a continuous variable, you should center the continuous variable first. The coefficients in an interaction model only show the unique effect of one interacting variable when the value of the other is at zero, and for many variables zero is not a meaningful value. Imagine that the average IQ score is 100: a coefficient estimated at an IQ of zero would describe no one in the sample. You center the continuous variable by subtracting the mean score from each data point. Typically this means your mean score should be entered to at least 4 decimal places (though the number of decimal places needed will depend on your data). Because the interaction term is computed from centered scores, we also have little concern about multicollinearity influencing this regression analysis.

In SPSS, click on "Linear Regression" and enter the test score variable as the DV. Click "Next" and enter the same two variables AND the new interaction variable as the IVs. Clicking "Statistics" opens the "Linear Regression: Statistics" box, as shown in Figure 4. For this example, just check "Histogram" under the Descriptive heading.

The interaction coefficient is statistically significant, based on a p-value of less than .001. Focus is given not to the main effects but to the difference in slopes, which is described by the interaction coefficient. As before, the intercept (3.88) can be interpreted as the average level of the dependent variable (satisfaction with the economy) when the values of the independent variables are at zero (in this case, for women who did not vote).

Figure 6 shows a roughly normal distribution, with a peak at the lowest values. Figure 7 shows a very slightly negatively skewed distribution, with a peak of values just above the mean, but it is close enough to normal not to warrant any concern.
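The centering and product steps described above can be sketched outside SPSS. The following is a minimal Python illustration with fabricated data; the variable names `iq` and `voted` are assumptions for the sketch, not variables from the example dataset:

```python
# Hedged sketch (not SPSS): center a continuous predictor and build
# the interaction term by hand, using small fabricated data.
iq = [95.0, 100.0, 105.0, 110.0, 90.0]   # continuous predictor (hypothetical)
voted = [0, 1, 0, 1, 1]                  # 0/1 categorical predictor (hypothetical)

# Center the continuous variable: subtract the mean from each data point.
iq_mean = sum(iq) / len(iq)
iq_c = [x - iq_mean for x in iq]

# The interaction variable is the product of the centered continuous
# variable and the categorical variable.
interaction = [c * v for c, v in zip(iq_c, voted)]

print(round(iq_mean, 4))  # report the mean to at least 4 decimal places
print(iq_c)
print(interaction)
```

In SPSS the same two steps would be done with the "Compute" command; here they are ordinary list comprehensions.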
Then, use the "Compute" command in SPSS to create a new variable that is the original values minus the mean. Click "Next" and enter both centered variables AND the new interaction variable as the IVs. This provides estimates for both models and a significance test of the difference between the R-squared values. (Note that if you want to compute predicted values from the main-effects model as well as the interaction model, you need to select this option and run the analysis for the first model prior to selecting the variables for the second model.)

In the output, look at the second model in the "Coefficients" box. An interaction is depicted as a significant value for the interaction variable; the results show a value of 5.824 and an associated p-value of .016. Figure 10 reports a coefficient of −.069 for the variable conformity and .236 for voter.

This opens another dialog box where you can select the plots you want to produce. In the "Define Multiple Line: Summaries for Groups of Cases" dialog box that opens, shown in Figure 12, check "Other statistic (e.g., mean)", highlight the new variable, Unstandardized Predicted Values [PRE_1], and click the arrow to move it to the "Variable" text box.

The plot makes the difference in slopes visible: the relationship may be positive for small values of pred1 but becomes negative for larger values (reading from left to right). The main effects could sensibly be modelled linearly, but the interaction could not.
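As a hedged sketch outside SPSS, the hierarchical comparison of the main-effects model against the interaction model can be reproduced with ordinary least squares in Python. The data below are simulated purely for illustration, and the names `pred1` and `group` are assumptions, not the tutorial's variables:

```python
# Hedged sketch (Python/numpy, not SPSS): fit the main-effects model and
# the interaction model by OLS and compare their R-squared values,
# analogous to SPSS's hierarchical "Next" blocks. Data are simulated.
import numpy as np

rng = np.random.default_rng(0)
n = 200
pred1 = rng.normal(0.0, 1.0, n)              # centered continuous predictor
group = rng.integers(0, 2, n).astype(float)  # 0/1 categorical predictor
# Simulate an outcome with a genuine interaction: the slope of pred1
# differs between the two groups.
y = 3.88 + 0.5 * pred1 + 0.2 * group - 1.0 * pred1 * group + rng.normal(0, 0.5, n)

def r_squared(X, y):
    """R-squared from an OLS fit, with an intercept column prepended."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    tss = (y - y.mean()) @ (y - y.mean())
    return 1.0 - (resid @ resid) / tss

r2_main = r_squared(np.column_stack([pred1, group]), y)
r2_int = r_squared(np.column_stack([pred1, group, pred1 * group]), y)

print(round(r2_main, 3), round(r2_int, 3), round(r2_int - r2_main, 3))
```

Adding the product term can only increase R-squared; the significance test SPSS reports asks whether that increase is larger than chance alone would produce.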