<b>WarpPLS</b>, a blog by Ned Kock<br /><br /><b>PLS Applications Symposium; 11 - 13 April 2018; Laredo, Texas</b><br />(Abstract submissions accepted until 15 February 2018)<br /><br />*** Only abstracts are needed for the submissions ***<br /><br />The partial least squares (PLS) method has increasingly been used in a variety of fields of research and practice, particularly in the context of PLS-based structural equation modeling (SEM). The focus of this Symposium is on the application of PLS-based methods, from a multidisciplinary perspective. For types of submissions, deadlines, and other details, please visit the Symposium’s web site:<br /><br /><a href="http://plsas.net/" target="_blank">http://plsas.net</a><br /><br />*** Workshop on PLS-SEM ***<br /><br />On 11 April 2018 a full-day workshop on PLS-SEM will be conducted by Dr. Ned Kock and Dr. Geoffrey Hubona, using the software WarpPLS. Dr. Kock is the original developer of this software, which is one of the leading PLS-SEM tools today, used by thousands of researchers from a wide variety of disciplines and from many different countries. Dr. Hubona has extensive experience conducting research and teaching on topics related to PLS-SEM, using WarpPLS and a variety of other tools. This workshop will be hands-on and interactive, and will have two parts: (a) basic PLS-SEM issues, conducted in the morning (9 am - 12 noon) by Dr. Hubona; and (b) intermediate and advanced PLS-SEM issues, conducted in the afternoon (2 pm - 5 pm) by Dr. Kock.
Participants may attend either part, or both.<br /><br />The following topics, among others, will be covered:<br /><br />- Running a Full PLS-SEM Analysis<br />- Conducting a Moderating Effects Analysis<br />- Viewing Moderating Effects via 3D and 2D Graphs<br />- Creating and Using Second Order Latent Variables<br />- Viewing Indirect and Total Effects<br />- Viewing Skewness and Kurtosis of Manifest and Latent Variables<br />- Viewing Nonlinear Relationships<br />- Solving Collinearity Problems<br />- Conducting a Factor-Based PLS-SEM Analysis<br />- Using Consistent PLS Factor-Based Algorithms<br />- Exploring Statistical Power and Minimum Sample Sizes<br />- Exploring Conditional Probabilistic Queries<br />- Exploring Full Latent Growth<br />- Conducting Multi-Group Analyses<br />- Assessing Measurement Invariance<br />- Creating Analytic Composites<br /><br />-----------------------------------------------------------<br />Ned Kock<br />Symposium Chair<br /><a href="http://plsas.net/" target="_blank">http://plsas.net</a><br /><br /><b>Data labels</b> (4 December 2017)<br />In WarpPLS data labels can be added through the menu options “Add data labels from clipboard” and “Add data labels from file”. Data labels are text identifiers that are entered by you through these options, one column at a time.<br /><br />Like the original numeric dataset, the data labels are stored in a table.
Each column of this table refers to one data label variable, and each row to the corresponding row of the original numeric dataset.<br /><br /><div class="separator" style="clear: both; text-align: center;"><a href="https://1.bp.blogspot.com/-EUbuCHhkWOE/WiVYD8zU8NI/AAAAAAAABXY/aPZexu71dnwHxZPnn78yyTphZSEnt14rwCLcBGAs/s1600/Temp04.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="575" data-original-width="941" height="243" src="https://1.bp.blogspot.com/-EUbuCHhkWOE/WiVYD8zU8NI/AAAAAAAABXY/aPZexu71dnwHxZPnn78yyTphZSEnt14rwCLcBGAs/s400/Temp04.png" width="400" /></a></div><br /><br />Data labels can be shown on graphs (as illustrated above), either next to each data point that they refer to, or as part of the legend for a graph. The short video linked below illustrates this.<br /><br /><a href="https://youtu.be/i5-_WIMXVl4" target="_blank">https://youtu.be/i5-_WIMXVl4</a><br /><br />Once they have been added, data labels can be viewed or saved using the “View or save data labels” option.<br /><br />Data labels can also be used to discover moderating effects, as discussed in the blog post linked below.<br /><br /><a href="http://warppls.blogspot.com/2014/02/using-data-labels-to-discover.html" target="_blank">http://warppls.blogspot.com/2014/02/using-data-labels-to-discover.html</a><br /><br />This can be done in conjunction with the “Explore full latent growth” option, which provides a powerful alternative for the identification of moderating effects:<br /><br /><a href="https://warppls.blogspot.com/2017/10/full-latent-growth.html" target="_blank">https://warppls.blogspot.com/2017/10/full-latent-growth.html</a><br /><br /><b>True composite and factor reliabilities</b> (5 October 2017)<br />The menu option “Explore additional
coefficients and indices”, available in WarpPLS starting in version 6.0, allows you to obtain an extended set of reliabilities. The extended set of reliabilities includes the classic reliability coefficients already available in the previous version of this software, plus the following, for each latent variable in your model: Dijkstra's PLSc reliability (also available via the new menu option “Explore Dijkstra's consistent PLS outputs”), true composite reliability, and factor reliability. When factor-based PLS algorithms are used in analyses, the true composite reliability and the factor reliability are produced as estimates of the reliabilities of the true composites and factors. They are calculated in the same way as the classic composite reliabilities available from the previous version of this software, but with different loadings. When classic composite-based (i.e., non-factor-based) algorithms are used, both true composites and factors coincide, and are approximated by the composites generated by the software. As such, true composite and factor reliabilities equal the corresponding composite reliabilities whenever composite-based algorithms are used.<br /><br />Related YouTube video:<br /><br />Explore True Composite and Factor Reliabilities in WarpPLS<br /><br /><a href="http://youtu.be/DwslOCEvOd4" target="_blank">http://youtu.be/DwslOCEvOd4</a><br /><br /><b>Fit indices comparing indicator correlation matrices</b> (5 October 2017)<br />The new menu option “Explore additional coefficients and indices”, available in WarpPLS starting in version 6.0, allows you to obtain an extended set of model fit and quality indices.
The extended set of model fit and quality indices includes the classic indices already available in the previous version of this software, as well as new indices that allow investigators to assess the fit between the model-implied and empirical indicator correlation matrices. These new indices are the standardized root mean squared residual (SRMR), standardized mean absolute residual (SMAR), standardized chi-squared (SChS), standardized threshold difference count ratio (STDCR), and standardized threshold difference sum ratio (STDSR). As with the classic model fit and quality indices, the interpretation of these new indices depends on the goal of the SEM analysis. Since these indices refer to the fit between the model-implied and empirical indicator correlation matrices, they become more meaningful when the goal is to find out whether one model has a better fit with the original data than another, particularly when used in conjunction with the classic indices. When assessing the model fit with the data, several criteria are recommended. These criteria are discussed in the WarpPLS User Manual.<br /><br />Related YouTube video:<br /><br />Explore Indicator Correlation Matrix Fit Indices in WarpPLS<br /><br /><a href="http://youtu.be/YutkhEPW-CE" target="_blank">http://youtu.be/YutkhEPW-CE</a><br /><br /><b>Dijkstra's consistent PLS outputs</b> (5 October 2017)<br />The menu option “Explore Dijkstra's consistent PLS outputs”, available in WarpPLS starting in version 6.0, allows you to obtain key outputs generated based on Dijkstra's consistent PLS (a.k.a. PLSc) technique.
These outputs include PLSc reliabilities for each latent variable, also referred to as Dijkstra's rho_a's, which appear to be, in many contexts, better approximations of the true reliabilities than the measures usually reported in PLS-based SEM contexts: the composite reliability and Cronbach’s alpha coefficients. Also included in the outputs generated via this menu option are PLSc loadings, along with the corresponding standard errors, one-tailed and two-tailed P values, T ratios, and confidence intervals.<br /><br />Related YouTube video:<br /><br />Explore Dijkstra's Consistent PLS Outputs in WarpPLS<br /><br /><a href="http://youtu.be/WdKogy29OVg" target="_blank">http://youtu.be/WdKogy29OVg</a><br /><br /><b>Categorical-to-numeric conversion</b> (5 October 2017)<br />The menu option “Explore categorical-numeric-categorical conversion”, available in WarpPLS starting in version 6.0, allows you to perform categorical-to-numeric conversions. In a categorical-to-numeric conversion a user can convert a categorical variable, stored as a data label variable, into a numeric variable that is added to the dataset as a new standardized indicator. This new variable can then be used as a new indicator of an existing latent variable, or as a new latent variable with only one indicator.
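The general idea of a categorical-to-numeric conversion can be illustrated outside WarpPLS with plain Python. This is only a generic sketch with a hypothetical, hand-picked score map; the anchor-factorial modes described next assign values in a more principled way:

```python
# Generic illustration only (not WarpPLS code): turning a categorical
# (label) variable into a standardized numeric indicator via an explicit
# score map. The score map here is hypothetical, chosen by hand.
from statistics import mean, pstdev

def to_standardized_numeric(labels, scores):
    """Map each label to a score, then standardize the resulting column."""
    raw = [scores[lab] for lab in labels]
    m, s = mean(raw), pstdev(raw)
    return [(x - m) / s for x in raw]

labels = ["low", "high", "medium", "high", "low"]
scores = {"low": 1, "medium": 2, "high": 3}  # illustrative anchors
z = to_standardized_numeric(labels, scores)  # zero mean, unit variance
```

The resulting column could then serve as a single-indicator variable, which is the role such a converted variable plays in the text above.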
Three categorical-to-numeric conversion modes, to be used under different circumstances, are available: anchor-factorial with fixed variation, anchor-factorial with variation diffusion, and anchor-factorial with variation sharing.<br /><br />Related YouTube video:<br /><br />Explore Categorical-to-Numeric Conversion in WarpPLS<br /><br /><a href="http://youtu.be/XsytZqX7DBc" target="_blank">http://youtu.be/XsytZqX7DBc</a><br /><br /><b>Numeric-to-categorical conversion</b> (5 October 2017)<br />The menu option “Explore categorical-numeric-categorical conversion”, available in WarpPLS starting in version 6.0, allows you to perform numeric-to-categorical conversions. In a numeric-to-categorical conversion one or more of the following are converted into a single data label variable: latent variable, standardized indicator, or unstandardized indicator. This option is useful in multi-group analyses where the investigator wants to employ more than one numeric field for grouping. For example, let us assume that the following two unstandardized indicators are available: C, with the values 1 and 0 referring to individuals from the countries of Brazil and New Zealand; and G, with the values 1 and 0 referring to females and males.
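The two indicators just described, C and G, can be combined into one data label per case. The sketch below is plain Python rather than WarpPLS code, but it shows the idea of fusing two numeric fields into a single grouping label:

```python
# Hypothetical sketch (not WarpPLS code): combining two 0/1 indicators,
# C (country) and G (gender), into a single grouping label per row.

def combine_labels(c_values, g_values):
    """Build one data-label string per case from two numeric indicators."""
    return [f"C={c}G={g}" for c, g in zip(c_values, g_values)]

C = [1, 1, 0, 0]  # 1 = Brazil, 0 = New Zealand
G = [1, 0, 1, 0]  # 1 = female, 0 = male
print(combine_labels(C, G))  # ['C=1G=1', 'C=1G=0', 'C=0G=1', 'C=0G=0']
```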
By using a numeric-to-categorical conversion a researcher could create a new data label variable to conduct a multi-group analysis based on four groups: “C=1G=1”, “C=1G=0”, “C=0G=1” and “C=0G=0”.<br /><br />Related YouTube video:<br /><br />Explore Numeric-to-Categorical Conversion in WarpPLS<br /><br /><a href="http://youtu.be/TWTC-5pqKx8" target="_blank">http://youtu.be/TWTC-5pqKx8</a><br /><br /><b>Reciprocal relationships assessment</b> (5 October 2017)<br />Instrumental variables, available in WarpPLS starting in version 6.0, can be used to estimate reciprocal (or non-recursive) relationships. For this, you should use the sub-option “Reciprocal stochastic variation sharing”, under the new menu option “Explore analytic composites and instrumental variables”. To illustrate the sub-option “Reciprocal stochastic variation sharing” let us consider a population model with the following links: A > C, B > D, C > D and D > C. To test the reciprocal relationship between C and D you should first control for endogeneity in C and D, due to variation coming from B and A respectively, by creating two instrumental variables iC and iD via the sub-option “Single stochastic variation sharing” and adding these variables to the model. Next you should create two other instrumental variables through the sub-option “Reciprocal stochastic variation sharing”, which we will call here iCrD and iDrC, referring to the conceptual reciprocal links C > D and D > C respectively. (No links between C and D should be included in the model graph, since reciprocal links cannot be directly represented in this version of this software.) The final model, with all the links, will be as follows: A > C, iC > C, B > D, iD > D, iDrC > D and iCrD > C.
Here the link iDrC > D represents the conceptual link C > D, and can be used to test this conceptual link; and the link iCrD > C represents the conceptual link D > C, and can similarly be used to test this conceptual link.<br /><br />Related YouTube video:<br /><br />Estimate Reciprocal Relationships in WarpPLS<br /><br /><a href="http://youtu.be/jn8VZaOWe90" target="_blank">http://youtu.be/jn8VZaOWe90</a><br /><br /><b>Analytic composites and instrumental variables</b> (5 October 2017)<br />Analytic composites are weighted aggregations of indicators where the relative weights are set by you, usually based on an existing theory. The menu option “Explore analytic composites and instrumental variables”, available in WarpPLS starting in version 6.0, allows you to create analytic composites. This menu option also allows you to create instrumental variables. Instrumental variables are variables that selectively share variation with other variables, and only with those variables.
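Since an analytic composite is simply a fixed-weight sum of standardized indicators, the idea can be sketched in a few lines of plain Python. The data and weights below are illustrative, and this is not WarpPLS code:

```python
# Illustrative sketch (not WarpPLS code): an analytic composite is a
# weighted sum of standardized indicators, with weights fixed by theory.
from statistics import mean, pstdev

def standardize(xs):
    m, s = mean(xs), pstdev(xs)
    return [(x - m) / s for x in xs]

def analytic_composite(indicators, weights):
    """Weighted sum of standardized indicator columns, row by row."""
    cols = [standardize(col) for col in indicators]
    return [sum(w * col[i] for w, col in zip(weights, cols))
            for i in range(len(cols[0]))]

x1 = [3, 4, 5, 6, 7]   # made-up indicator 1
x2 = [2, 2, 4, 5, 6]   # made-up indicator 2
comp = analytic_composite([x1, x2], weights=[0.6, 0.4])  # theory-set weights
```

Because each standardized column has zero mean, the composite itself is centered; only the relative weights carry the theoretical content.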
Instrumental variables can be used to test and control for endogeneity.<br /><br />Related YouTube videos:<br /><br />Explore Analytic Composites in WarpPLS<br /><br /><a href="http://youtu.be/bxGi0OY8RD4" target="_blank">http://youtu.be/bxGi0OY8RD4</a><br /><br />Test and Control for Endogeneity in WarpPLS<br /><br /><a href="http://youtu.be/qCvvUxR978U" target="_blank">http://youtu.be/qCvvUxR978U</a><br /><br /><b>Multi-group analyses and measurement invariance assessment</b> (5 October 2017)<br />The menu options “Explore multi-group analyses” and “Explore measurement invariance”, available in WarpPLS starting in version 6.0, now allow you to conduct analyses where the data is segmented into various groups, all possible combinations of pairs of groups are generated, and each pair of groups is compared. In multi-group analyses path coefficients are normally compared, whereas in measurement invariance assessment the foci of comparison are loadings and/or weights. The grouping variables can be unstandardized indicators, standardized indicators, and labels.
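The segment-and-compare logic just described can be sketched in plain Python: hypothetical grouping labels segment the cases, and every pair of groups is enumerated for comparison:

```python
# Sketch with made-up data (not WarpPLS code): segment cases by a grouping
# label and enumerate every pair of groups, as a pairwise multi-group
# analysis does before comparing coefficients across each pair.
from itertools import combinations

labels = ["BR", "NZ", "US", "BR", "NZ", "US"]  # hypothetical group labels
rows = list(range(len(labels)))

groups = {}
for row, lab in zip(rows, labels):
    groups.setdefault(lab, []).append(row)  # row indices per group

pairs = list(combinations(sorted(groups), 2))
print(pairs)  # [('BR', 'NZ'), ('BR', 'US'), ('NZ', 'US')]
```

For each pair one would then compare path coefficients (multi-group analysis) or loadings and weights (measurement invariance) estimated within the two subsamples.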
These types of analyses can also be conducted via the new menu option “Explore full latent growth”, which presents several advantages (as discussed in the WarpPLS User Manual).<br /><br />Related YouTube videos:<br /><br />Explore Multi-Group Analyses in WarpPLS<br /><br /><a href="http://youtu.be/m2VKQGET-K8" target="_blank">http://youtu.be/m2VKQGET-K8</a><br /><br />Explore Measurement Invariance in WarpPLS<br /><br /><a href="http://youtu.be/29VqsAjhzqQ" target="_blank">http://youtu.be/29VqsAjhzqQ</a><br /><br /><b>Full latent growth</b> (5 October 2017)<br />Sometimes the actual inclusion of moderating variables and corresponding links in a model leads to problems; e.g., increases in collinearity levels, and the emergence of instances of Simpson’s paradox. The menu option “Explore full latent growth”, available in WarpPLS starting in version 6.0, allows you to completely avoid these problems, and estimate the effects of a latent variable or indicator on all of the links in a model (all at once), without actually including the variable in the model.
Moreover, growth in coefficients associated with links among different latent variables, and between a latent variable and its indicators, can be estimated, allowing for measurement invariance tests applied to loadings and/or weights.<br /><br />Related YouTube video:<br /><br />Explore Full Latent Growth in WarpPLS<br /><br /><a href="http://youtu.be/x_2e8DVyRhE" target="_blank">http://youtu.be/x_2e8DVyRhE</a><br /><br /><b>Conditional probabilistic queries</b> (5 October 2017)<br />If an analysis suggests that two variables are causally linked, yielding a path coefficient of 0.25 for example, this essentially means in probabilistic terms that an increase in the predictor variable leads to an increase in the conditional probability that the criterion variable will be above a certain value. Yet, conditional probabilities cannot be directly estimated based on path coefficients, and those probabilities may be of interest to both researchers and practitioners.
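A conditional probability of this kind can be estimated from raw data by simple counting. The sketch below uses made-up data and is only an illustration of the quantity involved, not the query engine itself:

```python
# Sketch with made-up data (not WarpPLS code): estimating P(Y > y0 | X > x0)
# by counting, the kind of quantity a conditional probabilistic query returns.

def cond_prob(xs, ys, x0, y0):
    """Share of cases with y > y0 among the cases with x > x0."""
    selected = [(x, y) for x, y in zip(xs, ys) if x > x0]
    if not selected:
        return float("nan")
    return sum(1 for _, y in selected if y > y0) / len(selected)

X = [0.1, 0.8, 1.2, -0.5, 0.9, 1.5]  # hypothetical predictor scores
Y = [0.0, 0.7, 1.1, -0.9, 0.2, 1.3]  # hypothetical criterion scores
p = cond_prob(X, Y, x0=0.5, y0=0.5)  # P(Y > 0.5 | X > 0.5) in this sample
```

In an actual query the conditions could combine several variables with relational (>, <=) and logical (&, |) operators, but the counting logic is the same.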
By using the “Explore conditional probabilistic queries” menu option, users of WarpPLS can, starting in version 6.0, estimate conditional probabilities via queries including combinations of latent variables, unstandardized indicators, standardized indicators, relational operators (e.g., > and <=), and logical operators (e.g., & and |).<br /><br />Related YouTube video:<br /><br /><a href="http://youtu.be/flLtdxmFj2A" target="_blank">Explore Conditional Probabilistic Queries in WarpPLS</a><br /><br /><b>Endogeneity assessment and control</b> (1 October 2017)<br />Instrumental variables can be used in WarpPLS, starting in version 6.0, to test and control for endogeneity, which occurs when the structural error term for an endogenous variable is correlated with any of the variable’s predictors. For example, let us consider a simple population model with the following links: A > B and B > C. This model presents endogeneity with respect to C, because variation flows from A to C via B, leading to a biased estimation of the path for the link B > C via ordinary least squares regression. Adding a link from A to C could be argued to “solve the problem”, but in fact it creates the possibility of a type I error, since the link A > C does not exist at the population level. A more desirable solution to this problem is to create an instrumental variable iC, incorporating only the variation of A that ends up in C and nothing else, and revise the model so that it has the following links: A > B, B > C and iC > C. The link iC > C can be used to test for endogeneity, via its P value and effect size. This link (i.e., iC > C) can also be used to control for endogeneity, thus removing the bias when the path coefficient for the link B > C is estimated via ordinary least squares regression.
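The role of A as the source of the instrument can be sketched in plain Python. This shows only one simple way to think about such a construction, taking the fitted values from a regression of C on A, and is not necessarily the algorithm WarpPLS implements:

```python
# Conceptual sketch (not WarpPLS's algorithm): one way to build a variable
# that carries only the variation of A that ends up in C is to take the
# fitted values from a simple regression of C on A. Data are made up.
from statistics import mean

def ols_slope_intercept(x, y):
    """Slope and intercept of a simple ordinary least squares fit."""
    mx, my = mean(x), mean(y)
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    return b, my - b * mx

A = [1.0, 2.0, 3.0, 4.0, 5.0]   # hypothetical scores on A
C = [1.2, 1.9, 3.2, 3.8, 5.1]   # hypothetical scores on C
b, a = ols_slope_intercept(A, C)
iC = [a + b * ai for ai in A]   # variation in C explained by A
```

Adding such a variable as a predictor of C, as in the revised model above, lets its path absorb the A-related variation so the B > C path can be estimated more cleanly.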
To create instrumental variables to test and control for endogeneity you should use the sub-option “Single stochastic variation sharing”, under the new menu option “Explore analytic composites and instrumental variables”.<br /><br />Related YouTube video:<br /><br />Test and Control for Endogeneity in WarpPLS<br /><br /><a href="http://youtu.be/qCvvUxR978U" target="_blank">http://youtu.be/qCvvUxR978U</a><br /><br /><b>Statistical power and minimum sample size requirements</b> (1 October 2017)<br />The WarpPLS menu option “Explore statistical power and minimum sample size requirements”, available starting in version 6.0, allows you to obtain estimates of the minimum required sample sizes for empirical studies based on the following model elements: the minimum absolute significant path coefficient in the model (e.g., 0.21), the significance level used for hypothesis testing (e.g., 0.05), and the power level required (e.g., 0.80). Two methods are used to estimate minimum required sample sizes: the inverse square root and gamma-exponential methods.
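The inverse square root method lends itself to a very short sketch. The version below assumes a one-tailed test, as is common in PLS-SEM reporting, and the exact WarpPLS implementation may differ in detail:

```python
# Sketch of the inverse square root method (one-tailed test assumed; this
# may not match WarpPLS's implementation exactly). For alpha = .05 and
# power = .80 the constant (z_alpha + z_power) is about 2.486.
from math import ceil
from statistics import NormalDist

def min_sample_size(beta_min, alpha=0.05, power=0.80):
    """Minimum N from the smallest absolute path coefficient expected."""
    z = NormalDist().inv_cdf
    return ceil(((z(1 - alpha) + z(power)) / abs(beta_min)) ** 2)

print(min_sample_size(0.21))  # minimum N for a smallest path of 0.21
```

The name of the method comes from inverting the relationship: the detectable path coefficient shrinks with the inverse square root of the sample size.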
Both methods closely mimic Monte Carlo simulations, and thus produce estimates in line with those that would be obtained through the Monte Carlo method.<br /><br />Related YouTube video:<br /><br />Explore Statistical Power and Minimum Sample Size in WarpPLS<br /><a href="http://youtu.be/mGT6-NKUe3E" target="_blank">http://youtu.be/mGT6-NKUe3E</a><br /><br />Article in the <i>Information Systems Journal</i> discussing the methods:<br /><a href="http://onlinelibrary.wiley.com/doi/10.1111/isj.12131/full" target="_blank">http://onlinelibrary.wiley.com/doi/10.1111/isj.12131/full</a><br /><br /><b>Factor-based SEM</b> (1 October 2017)<br />There has been a long and in some instances fairly antagonistic debate among proponents and detractors of the use of Wold’s original partial least squares (PLS) algorithms in the context of structural equation modeling (SEM). This debate has been fueled by one key issue: Wold’s original PLS algorithms do not deal with actual factors, as covariance-based SEM algorithms do, but with composites, which are exact linear combinations of indicators.
The factor-based SEM algorithms in WarpPLS have been developed specifically to address this perceived limitation of Wold’s original PLS algorithms.<br /><br />Related YouTube videos:<br /><br />Conduct a Factor-Based PLS-SEM Analysis with WarpPLS<br /><a href="http://youtu.be/PvXuD5COezU" target="_blank">http://youtu.be/PvXuD5COezU</a><br /><br />Use Consistent PLS Factor-Based Algorithms in WarpPLS<br /><a href="http://youtu.be/I5x4SuQHdME" target="_blank">http://youtu.be/I5x4SuQHdME</a><br /><br /><b>A thank you note to the participants in the workshop on PLS-SEM with WarpPLS; 12-13 August 2017; Penang, Malaysia</b> (16 August 2017)<br />I would like to thank the participants in the workshop on PLS-SEM with WarpPLS, conducted on 12-13 August 2017. The workshop took place in beautiful Penang, Malaysia. The group in the workshop was very smart and inquisitive, making it highly interactive.<br /><br /><div class="separator" style="clear: both; text-align: center;"><a href="https://1.bp.blogspot.com/-yfz8nAkfBOM/WZSUb8xnHfI/AAAAAAAABWY/DH0nxO62Zl0lGz-cvJdDB1kBXzar3yDKwCLcBGAs/s1600/GroupPhoto.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="957" data-original-width="1440" height="424" src="https://1.bp.blogspot.com/-yfz8nAkfBOM/WZSUb8xnHfI/AAAAAAAABWY/DH0nxO62Zl0lGz-cvJdDB1kBXzar3yDKwCLcBGAs/s640/GroupPhoto.jpg" width="640" /></a></div><br />Several of the participants were expert users of WarpPLS, and highly knowledgeable about structural equation modeling (SEM) issues, from both applied and philosophical perspectives.
We used WarpPLS version 6.0 (see blog post below).<br /><br /><a href="https://warppls.blogspot.com/2017/06/warppls-60-now-available-consistent-pls.html" target="_blank">https://warppls.blogspot.com/2017/06/warppls-60-now-available-consistent-pls.html</a><br /><br />Thank you, and all the best to all of the participants!<br /><br />Ned<br /><br /><b>WarpPLS 6.0 now available: consistent PLS, power analyses, conditional probabilities, latent growth, endogeneity control, new fit indices, reciprocal relationships, and more!</b> (30 July 2017)<br />Dear colleagues:<br /><br /><b>Version 6.0 of WarpPLS is now available, as a stable version (upgraded from beta on July 30, 2017)</b>. You can download and install it for a free trial from:<br /><br /><a href="http://warppls.com/" target="_new">http://warppls.com</a><br /><br />The full User Manual is also available for download from the web site above separately from the software.<br /><br /><b>Some important notes for users of previous versions:</b><br /><br />- There is no need to uninstall previous versions of WarpPLS to be able to install and use this new version.<br /><br />- Users of previous versions can use the same license information that they already have; it will work for version 6.0 for the remainder of their license periods.<br /><br />- Project files generated with previous versions are automatically converted to version 6.0 project files. Users are notified of that by the software, and given the opportunity not to convert the files if they so wish.<br /><br />- The MATLAB Compiler Runtime 7.14, used in this version, is the same as the one used in versions 2.0-5.0.
Therefore, if you already have one of those versions of WarpPLS installed on your computer, you do not need to reinstall the Runtime.<br /><br />WarpPLS is a powerful PLS-based structural equation modeling (SEM) software. Since its first release in 2009, its user base has grown steadily, now comprising more than 7,000 users in over 33 countries.<br /><br /><b>Some of the most distinguishing features of WarpPLS are the following:</b><br /><br />- It is very easy to use, with a step-by-step user interface guide.<br /><br />- It provides composite-based as well as factor-based algorithms.<br /><br />- It identifies nonlinear relationships, and estimates path coefficients accordingly.<br /><br />- It also models linear relationships, using standard PLS algorithms.<br /><br />- It models reflective and formative variables, as well as moderating effects.<br /><br />- It calculates P values, model fit indices, and collinearity estimates.<br /><br /><br />At the beginning of the User Manual you will see a list of <b>new features in this version, some of which are listed below</b>. The User Manual has more details on how these new features can be useful in SEM analyses.<br /><br />- <b>Factor-based PLS algorithms building on consistent PLS</b>. There has been a long and in some instances fairly antagonistic debate among proponents and detractors of the use of Wold’s original PLS algorithms in the context of SEM. This debate has been fueled by one key issue: Wold’s original PLS algorithms do not deal with actual factors, as covariance-based SEM algorithms do, but with composites, which are exact linear combinations of indicators. The previous version of this software offered various factor-based PLS algorithms to address this limitation. Those algorithms use the Cronbach’s alpha coefficient as a basis to estimate measurement error and true composite weights.
This version of the software continues this tradition by offering the following new factor-based PLS algorithms: Factor-Based PLS Type CFM3, Factor-Based PLS Type CFM2, Factor-Based PLS Type REG2, and Factor-Based PLS Type PTH2. A common characteristic of these new factor-based PLS algorithms is that they build on Dijkstra's consistent PLS (a.k.a. PLSc) technique, whose reliability measure appears to be, in many contexts, a better approximation of the true reliability than the reliability measures usually reported in PLS-based SEM contexts: the composite reliability and Cronbach’s alpha coefficients.<br /><br />Related YouTube video:<br /> Use Consistent PLS Factor-Based Algorithms in WarpPLS<br /> <a href="https://youtu.be/I5x4SuQHdME" target="_new">https://youtu.be/I5x4SuQHdME</a><br /><br />PowerPoint file (.pptx) for presentation:<br /> Factor-Based SEM Building on Consistent PLS<br /> <a href="http://cits.tamiu.edu/kock/NedPresentations/Research/Kock_2017_PLSFc.pptx" target="_new">http://cits.tamiu.edu/kock/NedPresentations/Research/Kock_2017_PLSFc.pptx</a><br /><br /><br />- <b>Statistical power and minimum sample size requirements</b>. The new menu option “Explore statistical power and minimum sample size requirements” now allows you to obtain estimates of the minimum required sample sizes for empirical studies based on the following model elements: the minimum absolute significant path coefficient in the model (e.g., 0.21), the significance level used for hypothesis testing (e.g., 0.05), and the power level required (e.g., 0.80). Two methods are used to estimate minimum required sample sizes: the inverse square root and gamma-exponential methods.
Both methods closely mimic Monte Carlo simulations, and thus produce estimates in line with those that would be obtained through the Monte Carlo method.<br /><br />Related YouTube video:<br /> Explore Statistical Power and Minimum Sample Size in WarpPLS<br /> <a href="https://youtu.be/mGT6-NKUe3E" target="_new">https://youtu.be/mGT6-NKUe3E</a><br /><br /><br />- <b>T ratios and confidence intervals</b>. While P values are widely used in PLS-based SEM, as well as in SEM in general, the statistical significances of path coefficients, weights and loadings can also be assessed employing T ratios and/or confidence intervals. These can now be obtained through the new menu option “Explore T ratios and confidence intervals”, which also allows you to set the confidence level to be used.<br /><br />Related YouTube video:<br /> Explore T Ratios and Confidence Intervals in WarpPLS<br /> <a href="https://youtu.be/Xao0T2MxJZM" target="_new">https://youtu.be/Xao0T2MxJZM</a><br /><br /><br />- <b>Conditional probabilistic queries</b>. If an analysis suggests that two variables are causally linked, yielding a path coefficient of 0.25 for example, this essentially means in probabilistic terms that an increase in the predictor variable leads to an increase in the conditional probability that the criterion variable will be above a certain value. Yet, conditional probabilities cannot be directly estimated based on path coefficients, and those probabilities may be of interest to both researchers and practitioners.
By using the “Explore conditional probabilistic queries” menu option, users of this software can now estimate conditional probabilities via queries including combinations of latent variables, unstandardized indicators, standardized indicators, relational operators (e.g., > and <=), and logical operators (e.g., & and |).<br /><br />Related YouTube video:<br /> Explore Conditional Probabilistic Queries in WarpPLS<br /> <a href="https://youtu.be/flLtdxmFj2A" target="_new">https://youtu.be/flLtdxmFj2A</a><br /><br /><br />- <b>Full latent growth</b>. Sometimes the actual inclusion of moderating variables and corresponding links in a model leads to problems; e.g., increases in collinearity levels, and the emergence of instances of Simpson’s paradox. The new menu option “Explore full latent growth” now allows you to completely avoid these problems, and estimate the effects of a latent variable or indicator on all of the links in a model (all at once), without actually including the variable in the model. Moreover, growth in coefficients associated with links among different latent variables, and between a latent variable and its indicators, can be estimated, allowing for measurement invariance tests applied to loadings and/or weights.<br /><br />Related YouTube video:<br /> Explore Full Latent Growth in WarpPLS<br /> <a href="https://youtu.be/x_2e8DVyRhE" target="_new">https://youtu.be/x_2e8DVyRhE</a><br /><br /><br />- <b>Multi-group analyses and measurement invariance assessment</b>. The new menu options “Explore multi-group analyses” and “Explore measurement invariance” now allow you to conduct analyses where the data is segmented into various groups, all possible combinations of pairs of groups are generated, and each pair of groups is compared. In multi-group analyses path coefficients are normally compared, whereas in measurement invariance assessment the foci of comparison are loadings and/or weights.
The grouping variables can be unstandardized indicators, standardized indicators, and labels. As mentioned above, these types of analyses can now also be conducted via the new menu option “Explore full latent growth”, which presents several advantages (as discussed in the WarpPLS User Manual).<br /><br />Related YouTube video:<br /> Explore Multi-Group Analyses in WarpPLS<br /> <a href="https://youtu.be/m2VKQGET-K8" target="_new">https://youtu.be/m2VKQGET-K8</a><br /><br />Related YouTube video:<br /> Explore Measurement Invariance in WarpPLS<br /> <a href="https://youtu.be/29VqsAjhzqQ" target="_new">https://youtu.be/29VqsAjhzqQ</a><br /><br /><br />- <b>Analytic composites and instrumental variables</b>. Analytic composites are weighted aggregations of indicators where the relative weights are set by you, usually based on an existing theory. The new menu option “Explore analytic composites and instrumental variables” allows you to create analytic composites. This menu option also allows you to create instrumental variables, which selectively share variation with other variables, and only with those variables.<br /><br />Related YouTube video:<br /> Explore Analytic Composites in WarpPLS<br /> <a href="https://youtu.be/bxGi0OY8RD4" target="_new">https://youtu.be/bxGi0OY8RD4</a><br /><br />Related YouTube video:<br /> Test and Control for Endogeneity in WarpPLS<br /> <a href="https://youtu.be/qCvvUxR978U" target="_new">https://youtu.be/qCvvUxR978U</a><br /><br /><br />- <b>Endogeneity assessment and control</b>. Instrumental variables can now be used to test and control for endogeneity, which occurs when the structural error term for an endogenous variable is correlated with any of the variable’s predictors. For example, let us consider a simple population model with the following links A > B and B > C. 
This model presents endogeneity with respect to C, because variation flows from A to C via B, leading to a biased estimation of the path for the link B > C via ordinary least squares regression. Adding a link from A to C could be argued as “solving the problem”, but in fact it creates the possibility of a type I error, since the link A > C does not exist at the population level. A more desirable solution to this problem is to create an instrumental variable iC, incorporating only the variation of A that ends up in C and nothing else, and revise the model so that it has the following links: A > B, B > C and iC > C. The link iC > C can be used to test for endogeneity, via its P value and effect size. This link (i.e., iC > C) can also be used to control for endogeneity, thus removing the bias when the path coefficient for the link B > C is estimated via ordinary least squares regression. To create instrumental variables to test and control for endogeneity you should use the sub-option “Single stochastic variation sharing”, under the new menu option “Explore analytic composites and instrumental variables”.<br /><br />Related YouTube video:<br /> Test and Control for Endogeneity in WarpPLS<br /> <a href="https://youtu.be/qCvvUxR978U" target="_new">https://youtu.be/qCvvUxR978U</a><br /><br /><br />- <b>Reciprocal relationships assessment</b>. Instrumental variables can also be used to estimate reciprocal relationships. For this, you should use the sub-option “Reciprocal stochastic variation sharing”, under the new menu option “Explore analytic composites and instrumental variables”. To illustrate the sub-option “Reciprocal stochastic variation sharing” let us consider a population model with the following links: A > C, B > D, C > D and D > C. 
To test the reciprocal relationship between C and D you should first control for endogeneity in C and D, due to variation coming from B and A respectively, by creating two instrumental variables iC and iD via the sub-option “Single stochastic variation sharing” and adding these variables to the model. Next you should create two other instrumental variables through the sub-option “Reciprocal stochastic variation sharing”, which we will call here iCrD and iDrC, referring to the conceptual reciprocal links C > D and D > C respectively. (No links between C and D should be included in the model graph, since reciprocal links cannot be directly represented in this version of this software.) The final model, with all the links, will be as follows: A > C, iC > C, B > D, iD > D, iDrC > D and iCrD > C. Here the link iDrC > D represents the conceptual link C > D, and can be used to test this conceptual link; and the link iCrD > C represents the conceptual link D > C, and can similarly be used to test this conceptual link.<br /><br />Related YouTube video:<br /> Estimate Reciprocal Relationships in WarpPLS<br /> <a href="https://youtu.be/jn8VZaOWe90" target="_new">https://youtu.be/jn8VZaOWe90</a><br /><br /><br />- <b>Numeric-to-categorical conversion</b>. The new menu option “Explore categorical-numeric-categorical conversion” now allows you to perform numeric-to-categorical conversions. In a numeric-to-categorical conversion one or more of the following are converted into a single data label variable: latent variable, standardized indicator, or unstandardized indicator. This option is useful in multi-group analyses where the investigator wants to employ more than one numeric field for grouping. For example, let us assume that the following two unstandardized indicators are available: C, with the values 1 and 0 referring to individuals from the countries of Brazil and New Zealand; and G, with the values 1 and 0 referring to females and males. 
By using a numeric-to-categorical conversion a researcher could create a new data label variable to conduct a multi-group analysis based on four groups: “C=1G=1”, “C=1G=0”, “C=0G=1” and “C=0G=0”.<br /><br />Related YouTube video:<br /> Explore Numeric-to-Categorical Conversion in WarpPLS<br /> <a href="https://youtu.be/TWTC-5pqKx8" target="_new">https://youtu.be/TWTC-5pqKx8</a><br /><br /><br />- <b>Categorical-to-numeric conversion</b>. The new menu option “Explore categorical-numeric-categorical conversion” also allows you to perform categorical-to-numeric conversions. In a categorical-to-numeric conversion a user can convert a categorical variable, stored as a data label variable, into a numeric variable that is added to the dataset as a new standardized indicator. This new variable can then be used as a new indicator of an existing latent variable, or as a new latent variable with only one indicator. Three categorical-to-numeric conversion modes, to be used under different circumstances, are available: anchor-factorial with fixed variation, anchor-factorial with variation diffusion, and anchor-factorial with variation sharing.<br /><br />Related YouTube video:<br /> Explore Categorical-to-Numeric Conversion in WarpPLS<br /> <a href="https://youtu.be/XsytZqX7DBc" target="_new">https://youtu.be/XsytZqX7DBc</a><br /><br /><br />- <b>Dijkstra's consistent PLS outputs</b>. The new menu option “Explore Dijkstra's consistent PLS outputs” now allows you to obtain key outputs generated based on Dijkstra's consistent PLS (a.k.a. PLSc) technique. These outputs include PLSc reliabilities for each latent variable, also referred to as Dijkstra's rho_a's, which appear to be, in many contexts, better approximations of the true reliabilities than the measures usually reported in PLS-based SEM contexts – the composite reliability and Cronbach’s alpha coefficients. 
Also included in the outputs generated via this menu option are PLSc loadings; along with the corresponding standard errors, one-tailed and two-tailed P values, T ratios, and confidence intervals.<br /><br />Related YouTube video:<br /> Explore Dijkstra's Consistent PLS Outputs in WarpPLS<br /> <a href="https://youtu.be/WdKogy29OVg" target="_new">https://youtu.be/WdKogy29OVg</a><br /><br /><br />- <b>Fit indices comparing indicator correlation matrices</b>. The new menu option “Explore additional coefficients and indices” now allows you to obtain an extended set of model fit and quality indices. The extended set of model fit and quality indices includes the classic indices already available in the previous version of this software, as well as new indices that allow investigators to assess the fit between the model-implied and empirical indicator correlation matrices. These new indices are the standardized root mean squared residual (SRMR), standardized mean absolute residual (SMAR), standardized chi-squared (SChS), standardized threshold difference count ratio (STDCR), and standardized threshold difference sum ratio (STDSR). As with the classic model fit and quality indices, the interpretation of these new indices depends on the goal of the SEM analysis. Since these indices refer to the fit between the model-implied and empirical indicator correlation matrices, they become more meaningful when the goal is to find out whether one model has a better fit with the original data than another, particularly when used in conjunction with the classic indices. When assessing the model fit with the data, several criteria are recommended. These criteria are discussed in the WarpPLS User Manual.<br /><br />Related YouTube video:<br /> Explore Indicator Correlation Matrix Fit Indices in WarpPLS<br /> <a href="https://youtu.be/YutkhEPW-CE" target="_new">https://youtu.be/YutkhEPW-CE</a><br /><br /><br />- <b>New reliability measures</b>. 
The new menu option “Explore additional coefficients and indices” now also allows you to obtain an extended set of reliabilities. The extended set of reliabilities includes the classic reliability coefficients already available in the previous version of this software, plus the following, for each latent variable in your model: Dijkstra's PLSc reliability (also available via the new menu option “Explore Dijkstra's consistent PLS outputs”), true composite reliability, and factor reliability. When factor-based PLS algorithms are used in analyses, the true composite reliability and the factor reliability are produced as estimates of the reliabilities of the true composites and factors. They are calculated in the same way as the classic composite reliabilities available from the previous version of this software, but with different loadings. When classic composite-based (i.e., non-factor-based) algorithms are used, both true composites and factors coincide, and are approximated by the composites generated by the software. 
As such, true composite and factor reliabilities equal the corresponding composite reliabilities whenever composite-based algorithms are used.<br /><br />Related YouTube video:<br /> Explore True Composite and Factor Reliabilities in WarpPLS<br /> <a href="https://youtu.be/DwslOCEvOd4" target="_new">https://youtu.be/DwslOCEvOd4</a><br /><br /><br />Enjoy!<br /><div><br /></div>Ned Kockhttp://www.blogger.com/profile/02755560885749335053noreply@blogger.com0tag:blogger.com,1999:blog-2982097637919684815.post-43061344351051189812017-07-09T07:47:00.003-07:002017-07-10T09:17:30.919-07:00Hands-On Workshop on PLS-SEM with WarpPLS; 12-13 August 2017; Penang, Malaysia<br /><b><u>Hands-On Workshop on PLS-SEM with WarpPLS; 12-13 August 2017; Penang, Malaysia</u></b><br /><br /><b>*** <a href="http://warppls.blogspot.com/2017/06/warppls-60-now-available-consistent-pls.html" target="_blank">WarpPLS version 6.0</a> will be used ***</b><br /><br />Structural equation modeling (SEM), or path analysis with latent variables, is one of the most general and comprehensive statistical analysis methods. Path analysis, multiple regression, ANCOVA, ANOVA and other widely used statistical analysis methods can be seen as special cases of SEM.<br /><br />WarpPLS 6.0 is a very user-friendly and powerful SEM software tool, arguably the first of its kind to implement linear and nonlinear algorithms. It provides one of the most extensive sets of SEM outputs. Among other things it automatically calculates indirect and total effects and respective P values, as well as full collinearity estimates.<br /><br />This workshop (details below) is aimed at beginner and intermediate SEM practitioners. 
Among possible participants are those who are interested in: (a) being productive co-authors or research collaborators, even if not doing SEM analyses themselves; (b) conducting basic SEM analyses occasionally in the future; (c) conducting SEM analyses of intermediate complexity on a regular basis.<br /><br />Participants will receive a one-year license of WarpPLS 6.0. See the link below for more details on this software.<br /><br /><a href="http://warppls.com/" target="_blank">http://warppls.com/</a><br /><br /><br /><b>*** Registration and additional details ***</b><br /><br />Please contact Dr. S. Mostafa Rasoolimanesh via one of the emails below:<br /><br />rasooli1352@yahoo.com<br />wahida_ismail@usm.my<br />hbpusm2017@gmail.com<br /><br /><br /><b>*** Instructor ***</b><br /><br />Ned Kock, Ph.D.<br />WarpPLS Developer<br /><a href="http://nedkock.com/" target="_blank">http://nedkock.com</a><br /><br /><br /><b>*** Location and dates ***</b><br /><br />Vistana Hotel<br />Penang, Malaysia<br />12-13 August 2017 (Sat-Sun), 9 am–5 pm<br /><br />Link with location on Google Maps:<br /><br /><a href="https://goo.gl/maps/2F6QTEF21n92" target="_blank">https://goo.gl/maps/2F6QTEF21n92</a><br /><br /><br /><b>*** Workshop program at a glance ***</b><br /><br />The main goal of this workshop is to give participants a practical understanding of how to use the software WarpPLS to conduct variance-based SEM. The workshop is very hands-on and covers linear and nonlinear applications. Many topics will be covered directly or indirectly, depending on the questions received from the participants, who are welcome to bring their own datasets to the workshop. 
Below is a tentative overview of the topics covered:<br /><br />Day 1 of workshop<br /><br />- SEM Analysis with WarpPLS (all steps)<br />- Open or Create Project File to Save Work<br />- Read Raw Data Used in SEM Analysis<br />- Pre-process Data for SEM Analysis<br />- Define Variables and Links in SEM Model<br />- Perform SEM Analysis and View Results<br />- View Skewness and Kurtosis<br />- Conduct a Moderating Effects Analysis<br />- View Moderating Effects via 3D and 2D Graphs<br />- Create and Use Second Order Latent Variables<br />- View Indirect and Total Effects<br />- View Nonlinear Relationships<br /><br />Day 2 of workshop<br /><br />- Conduct a Factor-Based PLS-SEM Analysis<br />- Use Consistent PLS Factor-Based Algorithms<br />- Explore Statistical Power and Minimum Sample Size<br />- Explore Conditional Probabilistic Queries<br />- Explore Full Latent Growth<br />- Conduct Multi-Group Analyses<br />- Assess Measurement Invariance<br />- Create Analytic Composites<br />- Test and Control for Endogeneity<br />- Estimate Reciprocal Relationships<br />- Explore Numeric-to-Categorical Conversion<br />- Explore Categorical-to-Numeric Conversion<br />- View Indicator Correlation Matrix Fit Indices<br /><div><br /></div>Ned Kockhttp://www.blogger.com/profile/02755560885749335053noreply@blogger.com0tag:blogger.com,1999:blog-2982097637919684815.post-86423130182118269312017-04-12T18:58:00.000-07:002017-04-12T19:03:43.697-07:00A thank you note to the participants in the 2017 PLS Applications Symposium<div class="MsoNormal"><br />This is just a thank you note to those who participated, either as presenters or members of the audience, in the 2017 PLS Applications Symposium:</div><div class="MsoNormal"><br /></div><div class="MsoNormal">http://plsas.net/</div><div class="MsoNormal"><br /></div><div class="MsoNormal">As in previous years, it seems that it was a good idea to run the Symposium as part of the Western Hemispheric Trade Conference. 
This allowed attendees to take advantage of a subsidized registration fee, and also participate in other Conference sessions and the Conference's social event.</div><div class="MsoNormal"><br /></div><div class="MsoNormal">I have been told that the proceedings will be available soon from the Western Hemispheric Trade Conference web site.</div><div class="MsoNormal"><br /></div><div class="MsoNormal">Also, the full-day workshop on PLS-SEM using the software WarpPLS was well attended. This workshop was fairly hands-on and interactive. Some participants had a great deal of expertise in PLS-SEM and WarpPLS. It was a joy to conduct the workshop!</div><div class="MsoNormal"><br /></div><div class="MsoNormal">As soon as we define the dates, we will be announcing next year’s PLS Applications Symposium. Like this year’s Symposium, it will take place in Laredo, Texas, probably in mid-April as well.</div><div class="MsoNormal"><br /></div><div class="MsoNormal">Thank you and best regards to all!</div><div class="MsoNormal"><br /></div><div class="MsoNormal">-----------------------------------------------------------</div><div class="MsoNormal">Ned Kock</div><div class="MsoNormal">Symposium Chair</div><br /><div class="MsoNormal">http://plsas.net</div>Ned Kockhttp://www.blogger.com/profile/02755560885749335053noreply@blogger.com0tag:blogger.com,1999:blog-2982097637919684815.post-83139915446040751972017-04-07T07:16:00.000-07:002017-01-04T07:04:19.107-08:00PLS Applications Symposium; 5 - 7 April 2017; Laredo, Texas<br />PLS Applications Symposium; 5 - 7 April 2017; Laredo, Texas<br />(Abstract submissions accepted until 10 February 2017)<br /><br />*** Only abstracts are needed for the submissions ***<br /><br />The partial least squares (PLS) method has increasingly been used in a variety of fields of research and practice, particularly in the context of PLS-based structural equation modeling (SEM). 
The focus of this Symposium is on the application of PLS-based methods, from a multidisciplinary perspective. For types of submissions, deadlines, and other details, please visit the Symposium’s web site:<br /><br /><a href="http://plsas.net/" target="_blank">http://plsas.net</a><br /><br />*** Workshop on PLS-SEM ***<br /><br />On 5 April 2017 a full-day workshop on PLS-SEM will be conducted by Dr. Ned Kock, using the software WarpPLS. Dr. Kock is the original developer of this software, which is one of the leading PLS-SEM tools today; used by thousands of researchers from a wide variety of disciplines, and from many different countries. This workshop will be hands-on and interactive, and will have two parts: (a) basic PLS-SEM issues, conducted in the morning (9 am - 12 noon); and (b) intermediate and advanced PLS-SEM issues, conducted in the afternoon (2 pm - 5 pm). Participants may attend either one, or both of the two parts.<br /><br />The following topics, among others, will be covered - Running a Full PLS-SEM Analysis - Conducting a Moderating Effects Analysis - Viewing Moderating Effects via 3D and 2D Graphs - Creating and Using Second Order Latent Variables - Viewing Indirect and Total Effects - Viewing Skewness and Kurtosis of Manifest and Latent Variables - Conducting a Multi-group Analysis with Range Restriction - Viewing Nonlinear Relationships - Conducting a Factor-Based PLS-SEM Analysis - Viewing and Changing Missing Data Imputation Settings - Isolating Mediating Effects - Identifying and Dealing with Outliers - Solving Indicator Problems - Solving Collinearity Problems.<br /><br />-----------------------------------------------------------<br />Ned Kock<br />Symposium Chair<br /><a href="http://plsas.net/" target="_blank">http://plsas.net</a><br /><br />Ned 
Kockhttp://www.blogger.com/profile/02755560885749335053noreply@blogger.com0tag:blogger.com,1999:blog-2982097637919684815.post-58585047543629995842017-03-07T11:54:00.001-08:002017-11-02T06:01:13.173-07:00Model with endogenous dichotomous variable<br />How do we interpret the results of a model with an endogenous dichotomous variable? Let us use the model below to illustrate the answer to this question. In this model we have one endogenous dichotomous variable “Success” that is significantly caused in a direct way by two predictors: “Projmgt” and “JSat”. The direct effect of a third predictor, namely "ECollab", is relatively small and borderline significant.<br /><br /><div class="separator" style="clear: both; text-align: center;"><a href="https://3.bp.blogspot.com/-ZWzm1-fva78/WL8Pvb3iqQI/AAAAAAAABUY/JW7XfdPvEu0A_1x3zLI5ZrJhe4e1RGajACLcB/s1600/Kock_2017_ModelEndoVar.PNG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="164" src="https://3.bp.blogspot.com/-ZWzm1-fva78/WL8Pvb3iqQI/AAAAAAAABUY/JW7XfdPvEu0A_1x3zLI5ZrJhe4e1RGajACLcB/s400/Kock_2017_ModelEndoVar.PNG" width="400" /></a></div><br /><br />Let us assume that the unit of analysis is a team of people. The variable “Success” is coded as 0 or 1, meaning that a team is either successful or not. After standardization, the 0 and 1 will be converted into a negative and a positive number. The standardized version of the variable “Success” will have a mean of zero and a standard deviation of 1. <br /><br />One way to interpret the results is the following. The probability that a team will be successful (i.e., that “Success” > 0) is significantly affected by increases in the variables “Projmgt” and “JSat”. <br /><br />WarpPLS users are able, starting in version 6.0, to calculate conditional probabilities as shown below, without having to resort to transformations based on assumed underlying functions, such as those performed by logistic regression. 
In this screen shot, only latent variables are used, and they are all assumed to be standardized.<br /><br /><div class="separator" style="clear: both; text-align: center;"><a href="https://3.bp.blogspot.com/-Tt4SCQpVVHM/WL8P0hbaLXI/AAAAAAAABUc/nYqL2V7kuKcdXICP5kq16c7qkFEZqNHggCLcB/s1600/Kock_2017_ProbQueryEndoVar.PNG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="239" src="https://3.bp.blogspot.com/-Tt4SCQpVVHM/WL8P0hbaLXI/AAAAAAAABUc/nYqL2V7kuKcdXICP5kq16c7qkFEZqNHggCLcB/s400/Kock_2017_ProbQueryEndoVar.PNG" width="400" /></a></div><br /><br />In the screen shot above, we can see that the probability that a team will be successful (i.e., that “Success” > 0), if “Projmgt” > 1 and “JSat” > 1, is 52.2 percent. Stated differently, if “Projmgt” and “JSat” are high (greater than 1 standard deviation above the mean), then the probability of success is slightly greater than chance. <br /><br />A probability of 52.2 percent is not that high. The reason why it is not higher, in the context of the conditional probabilistic query above, is that we are not including the variable "ECollab" in the mix. Still, it does not seem like “Projmgt” and “JSat” being high are sufficient conditions for success, although they may be necessary conditions. <br /><br />Consider a different set of conditional probabilities. If a team is successful (i.e., if “Success” > 0), what is the probability that “Projmgt” and “JSat” are low for that team? The answer, shown in the screen below, is 1.3 percent. That is a very low probability, suggesting that “Projmgt” and “JSat” matter as necessary but not sufficient elements for success. 
<br /><br /><div class="separator" style="clear: both; text-align: center;"><a href="https://3.bp.blogspot.com/-pGyQZ-6B8Ts/WL8P7DAZ3YI/AAAAAAAABUg/Ccp7aMG0QvkrKgYoHm7qMhV0AFPBthnsACLcB/s1600/Kock_2017_ProbQueryEndoVar2.PNG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="277" src="https://3.bp.blogspot.com/-pGyQZ-6B8Ts/WL8P7DAZ3YI/AAAAAAAABUg/Ccp7aMG0QvkrKgYoHm7qMhV0AFPBthnsACLcB/s400/Kock_2017_ProbQueryEndoVar2.PNG" width="400" /></a></div><br /><br />These are among the conditional probabilistic queries that users are able to make starting in version 6.0 of WarpPLS. Bayes’ theorem is used to produce the answers to the queries. <br /><br />Ned Kockhttp://www.blogger.com/profile/02755560885749335053noreply@blogger.com1tag:blogger.com,1999:blog-2982097637919684815.post-13257435518990342662016-09-13T08:18:00.000-07:002017-10-10T13:18:28.104-07:00Measurement invariance assessment in PLS-SEM<br />WarpPLS users can assess measurement invariance in PLS-SEM analyses in a way analogous to a multi-group analysis. That is, WarpPLS users can compare pairs of measurement models to ascertain equivalence, using one of the multi-group comparison techniques building on the pooled and Satterthwaite standard error methods discussed in the article below. By doing so, they will ensure that any observed between-group differences in structural model coefficients, particularly in path coefficients, are not due to measurement model differences.<br /><br />Kock, N. (2014). <a href="http://www.scriptwarp.com/warppls/pubs/Kock_2014_UseSEsESsLoadsWeightsSEM.pdf" target="_new"> Advanced mediating effects tests, multi-group analyses, and measurement model assessments in PLS-based SEM.</a><i> International Journal of e-Collaboration</i>, 10(3), 1-13.<br /><br />For measurement invariance assessment, the techniques discussed in the article should be employed with weights and/or loadings. 
While with path coefficients researchers may be interested in finding statistically significant differences, with weights/loadings the opposite is typically the case – they will want to ensure that differences are not statistically significant. The reason is that significant differences between path coefficients can be artificially induced by significant differences between weights/loadings in different models. <br /><br />A spreadsheet with formulas for conducting a multi-group analysis building on the pooled and Satterthwaite standard error methods is available from <a href="http://warppls.com/" target="_new">WarpPLS.com</a>, under “Resources”. As indicated in the article linked above, this same spreadsheet can be used in the assessment of measurement invariance in PLS-SEM analyses.<br /><br />The menu options “Explore multi-group analyses” and “Explore measurement invariance”, available in WarpPLS starting in version 6.0, allow you to automatically conduct analyses like the ones above. Through these the data is segmented in various groups, all possible combinations of pairs of groups are generated, and each pair of groups is compared. As noted above, in multi-group analyses normally path coefficients are compared, whereas in measurement invariance assessment the foci of comparison are loadings and/or weights. The grouping variables can be unstandardized indicators, standardized indicators, and labels. 
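As a rough sketch of what one such pairwise comparison involves, the code below implements a common formulation of the Satterthwaite standard error method for comparing a coefficient (path coefficient, loading, or weight) across two groups. All numbers are hypothetical, and the exact formulas used by the spreadsheet and menu options are those given in the article cited above, which may differ in detail from this sketch:

```python
import math

def satterthwaite_compare(b1, se1, n1, b2, se2, n2):
    """Compare a coefficient estimated in two groups using the
    Satterthwaite (unpooled) standard error method: the difference
    is divided by the combined standard error, and the degrees of
    freedom follow the Welch-Satterthwaite approximation."""
    se_diff = math.sqrt(se1**2 + se2**2)
    t = (b1 - b2) / se_diff
    df = (se1**2 + se2**2) ** 2 / (se1**4 / (n1 - 1) + se2**4 / (n2 - 1))
    return t, df

# Hypothetical coefficients, standard errors, and group sizes
t, df = satterthwaite_compare(0.45, 0.10, 120, 0.20, 0.12, 150)
print(f"t = {t:.3f}, df = {df:.1f}")
```

The P value for the difference is then obtained from the t distribution with df degrees of freedom; for measurement invariance one hopes this P value is not significant, whereas in multi-group analyses of path coefficients a significant difference is usually the finding of interest.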
These types of analyses can also be conducted via the new menu option “Explore full latent growth”, which presents several advantages (as discussed in the WarpPLS User Manual).<br /><br />Related YouTube videos:<br /><br />Explore Multi-Group Analyses in WarpPLS<br /><br /><a href="http://youtu.be/m2VKQGET-K8" target="_blank">http://youtu.be/m2VKQGET-K8</a><br /><br />Explore Measurement Invariance in WarpPLS<br /><br /><a href="http://youtu.be/29VqsAjhzqQ" target="_blank">http://youtu.be/29VqsAjhzqQ</a><br /><br />Ned Kockhttp://www.blogger.com/profile/02755560885749335053noreply@blogger.com0tag:blogger.com,1999:blog-2982097637919684815.post-31025847129615678232016-09-13T07:40:00.000-07:002016-09-13T07:55:46.142-07:00Advantages of nonlinear over segmentation analyses in path models<br/>Nonlinear analyses employing the software WarpPLS allow for the identification of linear segments emerging from a nonlinear analysis, but without the need to generate subsamples. A new article is available demonstrating the advantages of nonlinear over data segmentation analyses. These include a larger overall sample size for calculation of P values, and the ability to uncover very high segment-specific path coefficients. Its reference, abstract, and link to full text are available below.<br/><br/> Kock, N. (2016). <a href="http://cits.tamiu.edu/kock/pubs/journals/2016/Kock_2016_IJeC_NonlinDataSegment.pdf" target="_new"> Advantages of nonlinear over segmentation analyses in path models.</a><i> International Journal of e-Collaboration</i>, 12(4), 1-6.<br/><br/> <i> The recent availability of software tools for nonlinear path analyses, such as WarpPLS, enables e-collaboration researchers to take nonlinearity into consideration when estimating coefficients of association among linked variables. 
Nonlinear path analyses can be applied to models with or without latent variables, and provide advantages over data segmentation analyses, including those employing finite mixture segmentation techniques (a.k.a. FIMIX). The latter assume that data can be successfully segmented into subsamples, which are then analyzed with linear algorithms. Nonlinear analyses employing WarpPLS also allow for the identification of linear segments mirroring underlying nonlinear relationships, but without the need to generate subsamples. We demonstrate the advantages of nonlinear over data segmentation analyses.</i><br/><br/> Among other things this article shows that identification of linear segments emerging from a nonlinear analysis with WarpPLS allows for: (a) a larger overall sample size for calculation of P values, which enables researchers to uncover actual segment-specific effects that could otherwise be rendered non-significant due to a combination of underestimated path coefficients and small subsample sizes; and (b) the ability to uncover very high segment-specific path coefficients, which could otherwise be grossly underestimated.<br/><br/> Enjoy! <br/><br/>Ned Kockhttp://www.blogger.com/profile/02755560885749335053noreply@blogger.com2tag:blogger.com,1999:blog-2982097637919684815.post-24988425914174563322016-09-01T18:29:00.000-07:002017-08-07T16:43:12.349-07:00Hypothesis testing with confidence intervals and P values<br />While P values are widely used in PLS-based SEM, as well as in SEM in general, the statistical significances of path coefficients, weights and loadings can also be assessed employing T ratios and/or confidence intervals. These can be obtained in WarpPLS through the menu option “Explore T ratios and confidence intervals”, which also allows you to set the confidence level to be used. 
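The arithmetic behind these outputs is simple; here is a minimal sketch with a hypothetical path coefficient and standard error. The 1.96 multiplier corresponds to a 95 percent confidence level under a normal approximation (WarpPLS itself lets you choose the confidence level, and its P values account for the heavier tails of the T distribution):

```python
coef = 0.32  # hypothetical path coefficient
se = 0.07    # hypothetical standard error

t_ratio = coef / se  # compared against a critical T value for significance
ci_lower = coef - 1.96 * se
ci_upper = coef + 1.96 * se

# The hypothesis is typically not supported when the interval includes zero
print(f"T ratio = {t_ratio:.2f}, 95% CI = [{ci_lower:.3f}, {ci_upper:.3f}]")
```

Here the interval excludes zero, so under the confidence-interval approach the corresponding hypothesis would be supported at the 95 percent level.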
This menu option becomes available after Step 5 is completed.<br /><br />Related YouTube video: Explore T Ratios and Confidence Intervals in WarpPLS<br /><br /><a href="https://youtu.be/Xao0T2MxJZM" target="_blank">https://youtu.be/Xao0T2MxJZM</a><br /><br />An article is also available explaining how WarpPLS users can test hypotheses based on confidence intervals, contrasting that approach with the one employing P values. A variation of the latter approach, employing T ratios, is also briefly discussed. Below are the reference, link to PDF file, and abstract for the article.<br /><br />Kock, N. (2016). Hypothesis testing with confidence intervals and P values in PLS-SEM. <i>International Journal of e-Collaboration</i>, 12(3), 1-6.<br /><br /><a href="http://cits.tamiu.edu/kock/pubs/journals/2016/Kock_2016_IJeC_ConfIntervalsPathModel.pdf" target="_blank">PDF file</a><br /><br /><u>Abstract:</u><br />E-collaboration researchers usually employ P values for hypothesis testing, a common practice in a variety of other fields. This is also customary in many methodological contexts, such as analyses of path models with or without latent variables, as well as simpler tests that can be seen as special cases of these (e.g., comparisons of means). We discuss here how a researcher can use another major approach for hypothesis testing, the one building on confidence intervals. We contrast this approach with the one employing P values through the analysis of a simulated dataset, created based on a model grounded on past theory and empirical research. The model refers to social networking site use at work and its impact on job performance. 
The results of our analyses suggest that tests employing confidence intervals and P values are likely to lead to very similar outcomes in terms of acceptance or rejection of hypotheses.<br /><br /><u>Note 1:</u><br />On Table 1 in the article, each T ratio and confidence interval limits (lower and upper) are calculated through the formulas included below. Normally a hypothesis will <i>not</i> be supported if the confidence interval includes the number 0 (zero).<br /><br />T ratio = (path coefficient) / (standard error).<br /><br />Lower confidence interval = (path coefficient) - 1.96 * (standard error).<br /><br />Upper confidence interval = (path coefficient) + 1.96 * (standard error).<br /><br /><u>Note 2:</u><br />Here is a quick note to technical readers. The P values reported in Table 1 in the article are calculated based on the T ratios using the incomplete beta function, which does not assume that the T distribution is exactly normal. In reality, T distributions have heavier tails than normal distributions, with the difference becoming less noticeable as sample sizes increase.<br /><br /><br />Ned Kockhttp://www.blogger.com/profile/02755560885749335053noreply@blogger.com5tag:blogger.com,1999:blog-2982097637919684815.post-42220816750057067112016-06-15T06:39:00.000-07:002016-06-15T06:39:43.911-07:00Simpson’s paradox, moderation, and the emergence of quadratic relationships in path models<br/>Among the many innovative features of WarpPLS are those that deal with identification of Simpson’s paradox and modeling of nonlinear relationships. A new article discussing various issues that are important for the understanding of the usefulness of these features is now available. Its reference, abstract, and link to full text are available below.<br/><br/> Kock, N., & Gaskins, L. (2016). 
<a href="http://cits.tamiu.edu/kock/pubs/journals/2016JournalIJANS_ModJCveNetCorrp/Kock_Gaskins_2016_IJANS_SimpPdox.pdf" target="_new"> Simpson’s paradox, moderation, and the emergence of quadratic relationships in path models: An information systems illustration.</a><i> International Journal of Applied Nonlinear Science</i>, 2(3), 200-234.<br/><br/> <i>While Simpson’s paradox is well-known to statisticians, it seems to have been largely neglected in many applied fields of research, including the field of information systems. This is problematic because of the strange nature of the phenomenon, the wrong conclusions and decisions to which it may lead, and its likely frequency. We discuss Simpson’s paradox and interpret it from the perspective of path models with or without latent variables. We define it mathematically and argue that it arises from incorrect model specification. We also show how models can be correctly specified so that they are free from Simpson’s paradox. In the process of doing so, we show that Simpson’s paradox may be a marker of two types of co-existing relationships that have been attracting increasing interest from information systems researchers, namely moderation and quadratic relationships.</i><br/><br/> Among other things this article shows that: (a) Simpson’s paradox may be caused by model misspecification, and thus can in some cases be fixed by proper model specification; (b) a type of model misspecification that may cause Simpson’s paradox involves missing a moderation relationship that exists at the population level; (c) Simpson’s paradox may actually be a marker of nonlinear relationships of the quadratic type, which are induced by moderation; and (d) there is a duality involving moderation and quadratic relationships, which requires separate and targeted analyses for their proper understanding.<br/><br/> Enjoy! <br/><br/>Ned Kockhttp://www.blogger.com/profile/02755560885749335053noreply@blogger.com0