Two-Day Hands-On Workshop on WarpPLS: SEM Fundamentals with Linear and Nonlinear Applications

*** Registration and additional details ***

http://bit.ly/oqoG5C

or

http://scriptwarp.com/warppls/prjs/2012_WarpPLSwkshp_Jan_SanAntonio

*** Instructor ***

Ned Kock, Ph.D.

WarpPLS Developer

http://nedkock.com

*** Location and dates ***

Our Lady of the Lake University

San Antonio, Texas

6-7 January 2012 (Fri-Sat), 8 am–5 pm

*** Workshop program at a glance ***

The main goal of this workshop is to give participants a practical understanding of how to use the software WarpPLS to conduct variance-based structural equation modeling (SEM). The workshop is very hands-on and covers linear and nonlinear applications.

Day 1 of workshop

• Overview of workshop and formation of teams

• Overview of web resources: Video clips, blog, publications, spreadsheets, and templates

• Overview of steps 1 to 5 of a complete SEM analysis

• Hands-on exercise: Steps 1 to 5 of a complete SEM analysis

• Resampling as shuffling multiple decks of cards

• Choosing the right resampling method

• Hands-on exercise: Changing the resampling method

• Choosing the right warping (i.e., nonlinear) algorithm

• Viewing plots of linear and nonlinear relationships

• Hands-on exercise: Changing the warping algorithm and viewing plots

• Charting non-standardized data

• Hands-on exercise: Charting non-standardized data

• Reading discussion: Kock (2011) – WarpPLS 2.0 User Manual

Day 2 of workshop

• Testing a mediating effect using the Baron & Kenny approach

• Hands-on exercise: Testing a mediating effect using the Baron & Kenny approach

• Testing a mediating effect using the Preacher & Hayes approach

• Hands-on exercise: Testing a mediating effect using the Preacher & Hayes approach

• Reading discussion: Kock et al. (2009) – Communication flow orientation article

• Testing a moderating effect

• Hands-on exercise: Testing a moderating effect

• Adding control variables into an analysis

• Conducting a multi-group analysis

• Conducting a full collinearity test

• Reading discussion: Zhang et al. (2010) – Organizing software testing article

• Hands-on exercise: Team project using participant’s own data

• Presentation of results from team project

## Monday, August 29, 2011

### Using WarpPLS in E-Collaboration Studies: Mediating Effects, Control and Second Order Variables, and Algorithm Choices

A new article discussing WarpPLS is available. The article is titled “Using WarpPLS in E-Collaboration Studies: Mediating Effects, Control and Second Order Variables, and Algorithm Choices”. It has been recently published in the International Journal of e-Collaboration. A full text version of the article is available here as a PDF file. Below is the abstract of the article.

*This is a follow-up on two previous articles on WarpPLS and e-collaboration. The first discussed the five main steps through which a variance-based nonlinear structural equation modeling analysis could be conducted with the software WarpPLS (Kock, 2010b). The second covered specific features related to grouped descriptive statistics, viewing and changing analysis algorithm and resampling settings, and viewing and saving various results (Kock, 2011). This and the previous articles use data from the same e-collaboration study as a basis for the discussion of important WarpPLS features. Unlike the previous articles, the focus here is on a brief discussion of more advanced issues, such as: testing the significance of mediating effects, including control variables in an analysis, using second order latent variables, choosing the right warping algorithm, and using bootstrapping and jackknifing in combination.*

## Monday, July 18, 2011

### WarpPLS workshop at Fundação Getúlio Vargas in June 2011: Details and some photos

Below are some photos from the June 2011 WarpPLS workshop at Fundação Getúlio Vargas, one of the highest ranked and most prestigious universities in Brazil in its main areas of focus. FGV’s main foci are business management and public administration. The workshop was held in Rio de Janeiro. The workshop participants included faculty, doctoral students, and master’s students at FGV.

This was a very hands-on workshop, as the participants had taken a course in structural equation modeling prior to it. They used Amos in that course, which was great because the workshop then highlighted the power of WarpPLS vis-à-vis a well established and also very useful tool for multivariate analyses with latent variables (Amos). We had about 15 contact hours for this workshop. Activities included commentaries based on video clips, live demonstrations, discussions of selected readings, and practical assignments focusing on linear and nonlinear empirical data analyses.

About 30 percent of the workshop was set aside for “free data analyses”, building on data that the participants brought into the workshop. That is, the participants had time to analyze their own data, and solve specific problems with my help. (There are always issues that are specific to a given dataset; e.g., problems with indicator loadings and interpretation of nonlinear results.) There was also a team workshop project, where participant teams presented an independent empirical study with analyses employing WarpPLS.

Some of the participants were faculty members from other universities in Rio de Janeiro, as well as employees of a few major research and training organizations in Brazil. Among these organizations were Fundação Oswaldo Cruz (a.k.a. FIOCRUZ), and the Escola de Comando e Estado Maior do Exército (ECEME). FIOCRUZ is one of the world’s foremost public health organizations, known for its strengths in various related areas, including epidemiological research. ECEME is an education institution that prepares officers of the Brazilian Army to take up command positions at the rank of General.

## Tuesday, June 28, 2011

### WarpPLS’ treatment of formative latent variables: PLS regression is more conservative and stable

WarpPLS uses what is often referred to as Wold’s original “PLS regression” algorithm to calculate indicator weights, for both formative and reflective variables. PLS regression was developed by Wold, and is slightly different from the modified versions often referred to as modes A and B, which are the ones normally used in other publicly available PLS-based structural equation modeling software. These modified versions implement an underlying algorithmic assumption that Lohmöller called the "good neighbor" assumption, whereby weights are influenced by inner model links.

Generally speaking, the PLS regression algorithm generates coefficients that are more stable and robust – i.e., reliable for hypothesis testing. It also tends to minimize collinearity. On the other hand, it may lead to a higher demand for computational power in some cases, which may be the reason why the modified versions were implemented. Lohmöller discusses multiple algorithm versions, with some characteristics placing them within broad types called “modes” – see Lohmöller (1989), the PLS "bible", for more details. Personal computers were not that powerful in the 1980s.

Moreover, the type of nonlinear treatment employed by WarpPLS is difficult to perform with Lohmöller’s underlying algorithm (the "good neighbor" assumption), whereby the outer model is influenced by the inner model. The problem is that with Lohmöller’s algorithm, as a model changes, the weights and loadings also change, even if the latent variables do not change. That is, with Lohmöller’s algorithm, two models with the same latent variables but different structures (i.e., links among latent variables) will have different weights and loadings.

The weights of formative latent variables will be essentially the same in WarpPLS as they would be if the variables were defined as reflective. That is, they will be obtained by an iterative algorithm that stops when two conditions are met: (a) the weights between indicators and latent variable are standardized partial regression coefficients calculated with the indicators as independent variables and the latent variable as the dependent variable; and (b) the regression equation expressing the latent variable as a combination of the indicators has an error term of zero.
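The two stopping conditions above can be illustrated with a minimal numpy sketch. This is not the WarpPLS implementation; the function name is illustrative, the indicators are assumed standardized, and the provisional score `t` stands in for the result of the inner estimation stage:

```python
import numpy as np

def final_weights_and_score(X, t):
    """Given standardized indicators X (n x k) and a provisional latent
    variable score t, return weights and a final score satisfying:
    (a) the weights are partial regression coefficients of the indicators
        predicting the latent variable, and
    (b) the final score is an exact linear combination of the indicators,
        so the regression error term is zero by construction."""
    # (a) partial regression coefficients, indicators as predictors
    w, *_ = np.linalg.lstsq(X, t, rcond=None)
    # (b) fitted values only: no residual term, then standardize the score
    score = X @ w
    score = (score - score.mean()) / score.std(ddof=0)
    # rescale weights so that X @ w reproduces the standardized score exactly
    w = w / (X @ w).std(ddof=0)
    return w, score
```

With centered, standardized indicators, `X @ w` equals the returned score exactly, which is what a zero error term means here.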

So why should the user define a latent variable as formative or reflective? The reason is the interpretation of the outputs generated by the software. When a latent variable is formative, both the P values for the indicator weights and the variance inflation factors for the indicators should generally be low; ideally below 0.05 and 2.5, respectively.

True formative variables are fundamentally different from true reflective variables, although there are cases that can be seen as “in between” formative and reflective. True formative and reflective variables behave differently, whether the software treats them differently or not. For example, with true formative variables you would expect the indicators to be significantly associated with the scores of their respective latent variable, which is indicated by low P values for their weights. However, you would *not* normally expect the indicators to be redundant, which is indicated by low variance inflation factors for the indicators.

The way formative variables are treated in Lohmöller’s approach leads to unstable weights, with the signs of weights frequently changing in the resample set. See Temme et al. (2006) for a discussion of this phenomenon. Lohmöller’s approach also leads to “lateral” collinearity, that is, collinearity between predictor and criterion latent variables. This “stealth” type of collinearity often leads to inflated path coefficients for links involving formative latent variables.

Formative variables don't "become reflective", or vice-versa, if one or another algorithm is used. This is a common misconception among users of PLS-based SEM software.


**References**

Lohmöller, J.-B. (1989). *Latent variable path modeling with partial least squares*. Heidelberg, Germany: Physica-Verlag.

Temme, D., Kreis, H., & Hildebrandt, L. (2006). *PLS path modeling – A software review*. Berlin, Germany: Institute of Marketing, Humboldt University Berlin.
Labels:
formative,
Lohmöller,
PLS regression,
reflective,
warppls,
Wold

## Saturday, June 25, 2011

### Dealing with country-specific number punctuation systems

WarpPLS users in countries that adopt number punctuation systems different from that adopted in the USA may have problems when using Excel to manipulate WarpPLS files.

For instance, in Brazil a comma is used to separate the integer from the fractional part of a real number (e.g., 1,431), whereas in the USA a period is used for that purpose (e.g., 1.431).

Because of that, a coefficient calculated by WarpPLS and exported into a .txt file as “1.431” may be read by a Brazilian version of Excel as one thousand four hundred and thirty-one, and not as one plus the 431/1000 fraction.

This tends to happen in certain types of analyses, such as second order latent variable analyses, where WarpPLS outputs are used as inputs after manipulation with country-specific versions of Excel.

A simple way to solve this problem is to open the file in Excel, Notepad, or another simple text editing tool and replace the offending punctuation (for example, all periods with commas, or vice versa) before using the file for other purposes.
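The same replacement can be scripted. The sketch below swaps decimal separators in a plain-text output file; the file names are illustrative, and note that a blanket replace also affects any non-numeric text in the file:

```python
def swap_decimal_separator(in_path, out_path, from_sep=".", to_sep=","):
    """Replace every occurrence of one decimal separator with another
    in a plain-text file, writing the result to a new file."""
    with open(in_path, encoding="utf-8") as f:
        text = f.read()
    with open(out_path, "w", encoding="utf-8") as f:
        f.write(text.replace(from_sep, to_sep))

# Example: prepare a WarpPLS .txt output for a Brazilian Excel
# swap_decimal_separator("lv_scores.txt", "lv_scores_br.txt")
```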


## Saturday, June 18, 2011

### Testing the significance of mediating effects with WarpPLS using the Preacher & Hayes approach

This post refers to the use of WarpPLS to test a mediating effect using what is often referred to as the Preacher and Hayes approach. This approach employs Sobel's standard error method (for a recent discussion, see: Kock, 2013). You can also test mediating effects directly with WarpPLS, using indirect and total effect outputs:

http://warppls.blogspot.com/2013/04/testing-mediating-effects-directly-with.html

Previously I also discussed on this blog the classic approach proposed by Baron & Kenny (1986) to test the significance of mediating effects with WarpPLS.

Preacher & Hayes (2004) proposed an alternative to Baron & Kenny's (1986) approach for testing the significance of mediating effects. This approach has been further extended by Hayes & Preacher (2010) to nonlinear relationships.

These approaches are implemented through an Excel spreadsheet available from the “Resources” area of the WarpPLS.com site, under “Excel files”. The spreadsheet, which implements the Sobel's standard error method, can be used with coefficients generated based on linear and nonlinear analyses.

The Excel spreadsheet above takes as inputs coefficients generated by WarpPLS, including path coefficients and their standard errors. The formulas used in it are discussed in a recent publication (Kock, 2013). The outputs are Sobel’s standard errors, product path coefficients, as well as T and P values, for mediating effects.
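The core Sobel computation the spreadsheet performs can be sketched as follows. This is the standard Sobel formula for an indirect effect; the exact formulas used by the spreadsheet are documented in Kock (2013):

```python
import math
from statistics import NormalDist

def sobel_test(a, se_a, b, se_b):
    """Sobel test for an indirect effect, where a is the path coefficient
    from predictor to mediator, b the path from mediator to criterion,
    and se_a, se_b their standard errors."""
    ab = a * b                                           # product path coefficient
    se_ab = math.sqrt(b**2 * se_a**2 + a**2 * se_b**2)   # Sobel's standard error
    z = ab / se_ab                                       # treated as a T statistic in large samples
    p = 2 * (1 - NormalDist().cdf(abs(z)))               # two-tailed P value
    return ab, se_ab, z, p
```

For example, with `a = 0.5`, `b = 0.4`, and standard errors of 0.1 each, the indirect effect of 0.2 is significant at the 1 percent level.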

**References**

Baron, R.M., & Kenny, D.A. (1986). The moderator–mediator variable distinction in social psychological research: Conceptual, strategic, and statistical considerations.

*Journal of Personality & Social Psychology*, 51(6), 1173-1182.

Hayes, A.F., & Preacher, K.J. (2010). Quantifying and testing indirect effects in simple mediation models when the constituent paths are nonlinear.

*Multivariate Behavioral Research*, 45(4), 627-660.

Kock, N. (2013).

*Advanced mediating effects tests, multi-group analyses, and measurement model assessments in PLS-based SEM*. Laredo, Texas: ScriptWarp Systems.

Preacher, K.J., & Hayes, A.F. (2004). SPSS and SAS procedures for estimating indirect effects in simple mediation models.

*Behavior Research Methods, Instruments, & Computers*, 36 (4), 717-731.

### Multi-group analysis with WarpPLS: Comparing path coefficients for two or more group samples

I previously discussed on this blog multi-group analysis with WarpPLS from the perspective of comparing means across two or more groups.

A different type of multi-group analysis would be one in which the same model is analyzed for two or more different samples, where each sample refers to a data group.

For example, a researcher could test the same model with data from the USA and Mexico. In this case, two project files would be used, and the goal of the multi-group analysis would be to assess whether the path coefficients differ significantly across groups.

An approach to conduct this type of multi-group analysis, employing the pooled and Satterthwaite standard error methods, is discussed in a recent publication (Kock, 2013). This approach is implemented through an Excel spreadsheet available from the “Resources” area of the WarpPLS.com site, under “Excel files”.

The Excel spreadsheet above takes as inputs coefficients generated by WarpPLS, including path coefficients and their standard errors. The outputs are T and P values for each pair of coefficients being compared. The formulas used in it are discussed in a recent publication (Kock, 2013).
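The two tests can be sketched as below. These are common formulations of the pooled and Satterthwaite standard error methods; the exact formulas used by the spreadsheet are documented in Kock (2013). Each function returns a T statistic and degrees of freedom, from which the P value is obtained via the t distribution:

```python
import math

def pooled_se_test(b1, se1, n1, b2, se2, n2):
    """Pooled standard error t test for the difference between two path
    coefficients from separate group samples (assumes similar variances).
    Returns (t, degrees of freedom)."""
    sp = math.sqrt(((n1 - 1) ** 2 * se1 ** 2 + (n2 - 1) ** 2 * se2 ** 2)
                   / (n1 + n2 - 2)) * math.sqrt(1 / n1 + 1 / n2)
    return (b1 - b2) / sp, n1 + n2 - 2

def satterthwaite_se_test(b1, se1, n1, b2, se2, n2):
    """Satterthwaite t test (does not assume equal variances across groups).
    Returns (t, approximate degrees of freedom)."""
    se = math.sqrt(se1 ** 2 + se2 ** 2)
    df = (se1 ** 2 + se2 ** 2) ** 2 / (se1 ** 4 / (n1 - 1) + se2 ** 4 / (n2 - 1))
    return (b1 - b2) / se, df
```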


**Reference**

Kock, N. (2013).

*Advanced mediating effects tests, multi-group analyses, and measurement model assessments in PLS-based SEM*. Laredo, Texas: ScriptWarp Systems.
Labels:
group samples,
multi-group analysis,
P value,
path coefficient,
warppls

## Friday, April 22, 2011

### Using WarpPLS in E-Collaboration Studies: Descriptive Statistics, Settings, and Key Analysis Results

A new article discussing WarpPLS is available. The article is titled “Using WarpPLS in E-Collaboration Studies: Descriptive Statistics, Settings, and Key Analysis Results”. It has been recently published in the International Journal of e-Collaboration. A full text version of the article is available here as a PDF file. Below is the abstract of the article.

*This is a follow-up on a previous article (Kock, 2010b) discussing the five main steps through which a nonlinear structural equation modeling analysis could be conducted with the software WarpPLS (warppls.com). Both this and the previous article use data from the same e-collaboration study as a basis for the discussion of important WarpPLS features. Unlike in the previous article, the focus here is on specific features related to saving and analyzing grouped descriptive statistics, viewing and changing analysis algorithm and resampling settings, and viewing and saving the various minor and major results of the analysis. Even though its focus is on an e-collaboration study this article contributes to the broad literature on multivariate analysis methods, in addition to the more specific research literature on e-collaboration. The reason for this is that the vast majority of relationships between variables, in investigations of both natural and behavioral phenomena, are nonlinear; usually taking the form of U and S curves. Structural equation modeling software tools, whether variance- or covariance-based, typically do not estimate coefficients of association based on nonlinear analysis algorithms. WarpPLS is an exception in this respect. Without taking nonlinearity into consideration, the results can be misleading; especially in complex and multi-factorial situations such as those stemming from e-collaboration in virtual teams.*

## Wednesday, April 20, 2011

### Transitioning from WarpPLS 1.0 to 2.0

Transitioning from version 1.0 to 2.0 of WarpPLS is very easy. Even though both can be installed and run on the same computer, and should not interfere with each other, I recommend using only the latest version.

There is no need to uninstall version 1.0, but if you want to do so you can follow the instructions at the beginning of the User Manual for version 1.0.

Version 1.0 users can enter their existing license information; it will work for version 2.0 for the remainder of their license periods.

Project files generated with version 1.0 can be used with version 2.0, but only after running Step 5 again. This is needed because version 2.0 generates additional estimates.

Enjoy!


## Friday, April 8, 2011

### Two new WarpPLS workshops in April and May of 2011

PLS-SEM.com will conduct two new online workshops on WarpPLS in April and May of 2011!

For more information on these and other WarpPLS workshops please visit:

http://pls-sem.com


## Wednesday, April 6, 2011

### Version 2.0 of WarpPLS is now available!

Version 2.0 of WarpPLS is now available!

Download and install it for a free trial from:

http://warppls.com

Enjoy!


## Friday, March 11, 2011

### Version 2.0 of WarpPLS will be available soon!

Version 2.0 of WarpPLS is currently being tested, and will be available soon, barring any unexpected problems.

Here is a list of new features in this version:

- APC fit index. The algorithm that calculates the average path coefficient (APC) was modified to correct a problem that was leading it to be underestimated for some models.

- Drag-and-drop user interface. Latent variables can now be moved around, during the model creation/editing step, via drag-and-drop actions.

- Help menu options. Most menus now provide one-click access to Web resources, including Web videos and the WarpPLS blog.

- Incremental code optimization. At several points the code was optimized for speed.

- Loadings and cross-loadings. The software now provides the following tables in connection with loadings and cross-loadings, from a confirmatory factor analysis, on both the screen and model estimates .txt file: combined loadings and cross-loadings, pattern loadings and cross-loadings, and structure loadings and cross-loadings.

- Moderating relationship visualization. The software now shows two plots for moderating relationships, referring to low and high values of the moderating variable. They can be viewed through the “View/plot linear and nonlinear relationships among latent variables” option, under “View and save results”.

- Saving model diagrams as .jpg files. The software now allows for model diagrams, with and without results, to be saved into .jpg files for later inclusion in reports.

- Standard errors for path coefficients. The software now reports standard errors for path coefficients, on both the screen and the model estimates .txt file. These allow for newer mediating effect tests to be conducted, in addition to Baron & Kenny’s (1986) test. The newer tests include those proposed by Preacher & Hayes (2004), and Hayes & Preacher (2010). These citations are fully referenced in the newest version of the User Manual.

- Support for .xlsx Excel files. The software now reads new .xlsx Excel files. Excel workbooks with multiple sheets can also be used, in which case the sheet with the data to be analyzed must be the first in the workbook.

- System.wrp file. The extension of the “System.sys” file was changed, and the content of the file was modified to store a few additional parameters necessary for code optimization. The file is now called “System.wrp”.

- VIFs for formative indicators. The software now calculates variance inflation factors (VIFs) for the indicators of formative latent variables, which can be used for indicator redundancy assessment.

Stay tuned!
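The VIFs mentioned in the last item of the list above follow the standard regression-based definition, which can be sketched as below. This is an illustrative numpy version, not the WarpPLS implementation:

```python
import numpy as np

def indicator_vifs(X):
    """Variance inflation factor for each indicator (column of X):
    regress the indicator on the remaining indicators and compute
    1 / (1 - R^2). High VIFs signal redundant indicators."""
    n, k = X.shape
    vifs = []
    for j in range(k):
        y = X[:, j]
        # remaining indicators plus an intercept column
        Z = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        r2 = 1 - ((y - Z @ beta) ** 2).sum() / ((y - y.mean()) ** 2).sum()
        vifs.append(1.0 / (1.0 - r2))
    return np.array(vifs)
```

Independent indicators yield VIFs near 1; a near-duplicate indicator yields a very large VIF, flagging redundancy.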

