Tuesday, March 10, 2009

Leadership Effectiveness and Enabling Others to Act

Kouzes and Posner (1997) suggest a strong relationship between the effectiveness of leadership and the process of enabling others to act. Indeed, leaders create a climate of collaboration that enables others to act.

By fostering collaboration and building spirited teams, they actively involve others both under their supervision and outside their responsibility. Leaders help create an atmosphere of trust and human dignity – those who would lead must demonstrate that they understand that encouraging mutual respect sustains extraordinary efforts. Leaders strengthen others by providing followers choices and sharing information about the tasks being pursued. Each follower feels powerful and capable when leadership responsibility is shared throughout the organization.

Organizations that foster collaboration by promoting cooperative goals and building trust can reach their goals while consuming fewer resources. Organizations with leaders who share power and information, provide options, help develop core competencies, use people to get important things done, and support their efforts strengthen employees in unique and powerful ways toward self-leadership.

In sum, leadership that cultivates a collaborative environment and self-leadership initiative on the part of employees improves organizational performance.

Reference

Kouzes, J.M., & Posner, B.Z. (1997). The leadership challenge (2nd ed.). San Francisco, CA: Jossey-Bass.

Saturday, February 28, 2009

SPSS Transform Recode into Different Variable in MS Excel to Reverse Response Scales

While SPSS allows one to perform some interesting transforms on variables from within the software package, it is often much cleaner to perform data transformations and variable computations from within MS Excel before importing the data into SPSS.

While using MS Excel, suppose your response data for a 1 to 10 “Likert scale” is in column B and you want to recode it into Column C:



Use the CHOOSE function in MS Excel to reverse the scale as follows:
1. Enter “=CHOOSE(B1,10,9,8,7,6,5,4,3,2,1)” without the quote marks into cell C1.
2. Copy the function in Step 1 down the column for the remainder of Column C.
3. Use Paste Special > Values to copy the values in Column C over the function.
Note: The CHOOSE function uses an “index” scheme to decide how to replace the values. For example, index location #1 corresponds to the first value in the argument list immediately after the cell B1, so a response value of 1 becomes 10, a 2 becomes 9, and so forth.
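If you prefer scripting, the same reverse coding can be sketched in a few lines of Python (a hypothetical example with made-up responses). The general rule for a 1-to-n scale is (n + 1) minus the response, which is exactly what the CHOOSE argument list spells out value by value:

```python
# Reverse-code a Likert response: on a 1-10 scale, 1 -> 10, 2 -> 9, ..., 10 -> 1.
# The general formula for a 1..n scale is (n + 1) - response.

def reverse_code(response, scale_max=10):
    """Return the reverse-coded value for a response on a 1..scale_max scale."""
    if not 1 <= response <= scale_max:
        raise ValueError(f"response {response} outside 1..{scale_max}")
    return (scale_max + 1) - response

responses = [1, 4, 7, 10]          # hypothetical column B values
print([reverse_code(r) for r in responses])  # [10, 7, 4, 1]
```

As with the Excel approach, do the recoding before importing the data into SPSS, and keep the original column so the transformation can be audited.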

Friday, January 16, 2009

Free Online Computer Classes for Workforce Development

HP Learning Center – free online classes
Free online classes, available 24/7: www.hp.com/go/learningcenter

Microsoft Office and Adobe

· Adobe Photoshop CS2: advanced
· Microsoft® Excel 2007, advanced part 1: analyzing data
· Microsoft® Excel 2007, advanced part 2: charts and graphics
· Microsoft® Excel 2007: create a PivotTable
· Microsoft® Excel 2007: extreme
· Microsoft® Excel 2007: filter data
· Microsoft® Excel 2007: intermediate
· Microsoft® Excel 2007: link and unlink content between two workbooks
· Microsoft® Excel 2007: record a simple macro and edit it in VBA
· Microsoft® Excel 2007: take a tour of the interface and learn basic skills
· Microsoft® OneNote: creating and using notebooks
· Microsoft® OneNote: getting started
· Microsoft® PowerPoint 2007: create a new slide master
· Microsoft® PowerPoint 2007: customize the PowerPoint interface
· Microsoft® PowerPoint 2007: introduction
· Microsoft® Word 2007: advanced
· Microsoft® Word 2007: intermediate
· Microsoft® Word 2007: take a tour of special features
· Microsoft® Word 2007: take a tour of the Ribbon
· Microsoft® Word 2007: use the Track Changes feature


Operating systems

· Microsoft® Outlook 2007: tips and tricks
· Microsoft® Windows Vista Sidebar: adding gadgets
· Microsoft® Windows Vista advanced customization: back up the registry
· Microsoft® Windows Vista advanced customization: increase bandwidth for network and internet connections
· Microsoft® Windows Vista: advanced customization
· Microsoft® Windows Vista: find files using basic Search
· Microsoft® Windows Vista: troubleshooting and maintenance
· Microsoft® Windows Vista: tune up your PC
· Microsoft® Windows Vista: use Disk Cleanup


Programming and web

· Adobe Dreamweaver CS3: introduction
· Adobe Flash CS3: beginning web animation
· Building your first web page UPDATED!
· Computer programming: introduction
· Intermediate website design
· JavaScript: web programming basics
· Microsoft® Access 2007: introduction
· Microsoft® Expression Web: building websites
· Microsoft® Project 2007: introduction


PC solutions

· Combating spam and spyware (with podcast)
· Evaluating desktop virtualization solutions
· Exploring and implementing Gobi and 3G technology
· Firewall basics (with podcast)
· HP Backup and Recovery Manager: schedule backups
· HP ProtectTools: security at your fingertips (quick lesson with podcast)
· Simple backup strategies with HP Backup and Recovery Manager (quick lesson with podcast)
· Simplify your IT infrastructure: reduce total cost of ownership
· Wireless networking with Bluetooth


Servers and storage solutions

· Disaster preparedness through virtualization (quick lesson)
· ERP: resource planning solutions
· Introduction to storage networks
· Linux 101: a beginner's guide
· Linux 201: administering Linux for users
· Linux 301: introduction to Linux system administration
· Network attached storage basics
· Protect your data: back up to tape, disk and the network
· Understanding Microsoft® Windows Server 2008
· Virtualize your infrastructure: deployment


Business skills

· Build your business identity with a new logo
· Color your business: develop a marketing color scheme
· Create marketing materials that align with your goals (quick lesson)
· Create your own marketing materials with free templates
· Funding a new small business
· Improve your personal networking skills
· Marketing writing tips: five mistakes to avoid (quick lesson)
· Meet the HP Smartphone: get connected, get more done (quick lesson)
· Promote your business with social networking
· Save money, be energy efficient
· Ten tips for printing better marketing materials in-house (quick lesson)


Computing and networking

· HP Backup and Recovery Manager: restore files
· HP Backup and Recovery Manager: schedule backups
· IT infrastructure and its challenges: outsource or hire? (quick lesson)
· Laptop PCs: basic troubleshooting and repair (quick lesson)
· Laptop PCs: troubleshooting wireless problems (quick lesson)
· Network administration best practices
· Networking 101
· Servers 101
· Simple backup strategies with HP Backup and Recovery Manager (quick lesson with podcast)
· Six steps to computer security (quick lesson)


Graphic arts

· Color your business: develop a marketing color scheme
· Create marketing materials that align with your goals (quick lesson)
· Create your own marketing materials with free templates
· Disaster preparedness through virtualization (quick lesson)
· HP Backup and Recovery Manager: restore files
· HP Backup and Recovery Manager: schedule backups
· IT infrastructure and its challenges: outsource or hire? (quick lesson)
· Marketing writing tips: five mistakes to avoid (quick lesson)
· Microsoft® Excel 2007: create a PivotTable
· Microsoft® Excel 2007: filter data
· Microsoft® Excel 2007: link and unlink content between two workbooks
· Microsoft® Excel 2007: record a simple macro and edit it in VBA
· Microsoft® Excel 2007: take a tour of the interface and learn basic skills
· Microsoft® Windows Vista Sidebar: adding gadgets
· Microsoft® Windows Vista advanced customization: back up the registry
· Microsoft® Windows Vista advanced customization: increase bandwidth for network and internet connections
· Microsoft® Windows Vista: find files using basic Search
· Microsoft® Windows Vista: use Disk Cleanup
· Microsoft® Word 2007: take a tour of special features
· Microsoft® Word 2007: take a tour of the Ribbon
· Microsoft® Word 2007: use the Track Changes feature
· Save money, be energy efficient
· Simple backup strategies with HP Backup and Recovery Manager (quick lesson with podcast)
· Six steps to computer security (quick lesson)
· Ten tips for printing better marketing materials in-house (quick lesson)
· Wireless networking with Bluetooth (quick lesson)
· Writing a high-impact business plan

http://www.hp.com/go/learningcenter/

Friday, December 26, 2008

Dangers of Formalizing and Testing Economic Theory

Ekelund and Hébert (1990), among other points, argue that over-formalizing economics through mathematical and statistical model building, without the concomitant guiding vision of a problem to solve, will be detrimental to the future of economic study. Are our present-day scholars of economics well equipped with tools, but bankrupt of original economic problems to research? It is worth examining each of these major points in detail.

Ekelund and Hébert (1990) draw several conclusions about the future of economics. First, econometrics is emphasized in graduate study and its mastery is required for advancement in the profession. Second, econometrics is so deeply prevalent that further use should be weighed against its expediency and costs. Third, proponents of the advancement of the scientific nature of economics have not fully considered whether it is possible to achieve that goal. Fourth, those who question continued formalization rightly argue that economic theory is not totally verifiable because of the ethereal nature of human behavior and the extremely high costs of collecting relevant, associated data. Fifth, economic ideas have historically led or paralleled the development of the techniques to test them, rarely vice versa, and to reverse the order of this discovery process runs the risk of dividing modern economics into two disciplines: a branch of mathematics and a branch of the social sciences. Finally, it is not the blind analysis of data in the pursuit of an elusive truth that makes economics relevant, but the healthy discourse among those of varied points of view.

Ekelund and Hébert (1990) point out that the study of mathematics and econometrics forms the basis of graduate economic curricula. University hiring policies and professional advancement within the discipline all require mastery and application of mathematical and empirical technique. Because this shift can be traced to before the second half of the 20th century, the authors’ view is that it heralds the future direction of economics as being toward mathematics and away from the social sciences.

Is graduate training a rut from which no scholar can lift himself? Whether this assertion is true can be debated. Every great economist in history went far beyond his formal training, often breaking with tradition in arduous academic journeys. Clearly those who are trained in mathematical analysis will have a natural tendency to rely on those techniques, but to suggest that they will not employ verbal or graphical descriptions of relevant economic phenomena is far-fetched. Smith, Ricardo, Mill, Marshall, Jevons, Walras, Veblen, and Keynes all pioneered new patterns of thinking and made original contributions to economic thought. Moreover, many of them did not even have the benefit of formal training in economics (often because the discipline did not formally exist). So if the history of economics is any predictor of the future, one cannot prove that econometric digressions in graduate training alone will forever doom the discipline.

Ekelund and Hébert (1990) assert that econometrics and mathematics have permeated modern microeconomic and macroeconomic theory, and that deeper use of econometrics in the discipline should be evaluated by weighing its costs, advantages, and disadvantages. From the vantage point of scholarship, the advantages and disadvantages of perpetuating mathematical techniques may be difficult to ascertain. Clearly, to the degree that it is possible to assess the future usefulness of the present development of econometric technique, such surveying should take place. However, the historian of economic thought has the unique perspective of being able to see past digressions in the discipline and employ 20-20 hindsight on those events. My view is that the permeation of microeconomics and macroeconomics with mathematical techniques is not a cause for alarm as long as progress in theory and policy is being made.

Advocates of continued formalization of economic theory through mathematics argue that theories must be formalized and verified if economics as a discipline is to achieve the completeness, and be afforded the respect, of a science. Ekelund and Hébert (1990) make the salient point that while this may be a necessary or even important goal, its pursuit seems to have taken precedence over asking whether it is an achievable goal. The central theme in this portion of the debate is that economists do not get the respect they really deserve. Economic theories that have not been proven do not carry the weight of gospel and are difficult to translate into economic policy.

Should economics assume the rigidity of a formal science? This is an important question with far-reaching implications. Would it not be better to view it, as Ekelund and Hébert (1990, p. 604) suggest, as “a powerful, though somewhat imprecise behavioral science?” Even truths we hold dear in the physical sciences are neither absolute nor free of the founding assumptions on which an entire discipline rests. With this in mind, economic policy must be formulated whether or not economics is regarded as a precise and formal science. Imagine the result if the political economists simply had no recommendations for improving the post-mercantilist economies because of the difficulty in collecting data or the incomplete testing of theories. Absurd!

Opponents of continued mathematization and formalization argue that economics is essentially a social science, and as such is subordinate to the whims of human behavior and subject to the inexactness and vagaries of small data samples and incomplete theories describing economic phenomena. Furthermore, the authors indicate that the critics of econometrics argue that data in the quantities and qualities necessary to validate the economic theories in question can be obtained only at prohibitive cost.

While it may be true that economics is essentially a social science, all great strides of progress in the study of economics have had both theoretical and methodological components. In other words, the methodology to test theory was closely associated with, or developed in parallel to, the theory itself. For example, it was economists such as Edgeworth who developed descriptive statistical techniques powerful enough to serve as the foundation for modern statistics and econometrics. Cournot was one of the first to recognize the importance of mathematical tools in accurately encapsulating economic ideas, avoiding the digressions into vague argumentation that could occur when economic analysis was explained only in literary form. One possible danger of widespread use of econometrics is the sheer size of the data collection required for analysis – economic problems that do not lend themselves to this sort of analysis may be overlooked or ignored. On the other hand, Alfred Marshall resisted mathematical expression of economic ideas because he sought to preserve the accessibility of economics to the common man.

Ideas that fuel economic analysis run parallel with the development of theory. Econometric and mathematical formalization is beneficial in testing ideas as long as the limits to the final productiveness and efficacy of such techniques are understood. Econometrics has the potential to bog down economic studies in the tactical analysis of obscure ambiguities in the data collected instead of moving forward with the business of strategic problem solving. Some of the mathematical techniques developed recently have not been particularly powerful or far-reaching, and so they are highly subject to revision and replacement. Ekelund and Hébert (1990) hold the view that as long as the focus of economic study is on mathematics and empiricism, the debate has the potential to divide the discipline of economics into two separate camps: (1) economics as a branch of behavioral science, and (2) econometrics as a branch of applied mathematics.

The danger of creating a new discipline that focuses on the development of econometric techniques may be real, but that result would hardly be detrimental or without precedent. Varied schools of thought have come and gone throughout the history of economic study. It is the very diversity of ideas that forms the basis of a healthy and interesting dialogue among economists. Without a variety of ideas, and great minds behind those ideas, the study of economics would be far less interesting and probably less relevant. Given the rich history of debate in economics, what is the harm in allowing the current fascination with econometric analysis to run its course?

Finally, over-formalization of theory in economics using mathematical or empirical technique is in danger of rendering economics boring or irrelevant. Mathematical formalization should not be abandoned, but neither should it be the driving factor in economic thought. The use of mathematics should be accepted critically, with a judicious measure of skepticism. The runaway developments in econometrics may be slowing, as some signs point to a slowdown in the production of articles about econometrics in economics journals. It is important that those prominent in economics remember that absolute answers in economics are elusive and that mathematical techniques cannot find truth; they can only raise confidence, lower confidence, or reduce ambiguity in postulated ideas. Indeed, the quest for scientific legitimacy has put the future of economics as a field of heterodox discourse in question.

Of course, one should not think that the use of mathematics is ruining the discipline of economics. Hardly. The many fine minds now conducting research should tremendously improve our ability to intervene in dysfunctional aspects of the economy and leave alone those elements that are working well. From a historical perspective, the message of the present fascination with econometrics is that it will shift and evolve into new solutions in ways that no one, not even the most astute historian of economics, can anticipate.

What is the end purpose of economic inquiry? Should it not be formulating practical answers to the questions of economic policy? This is not a new debate. In fact, the issue of mathematical formalization of economics has been debated continually since Adam Smith wrote The Wealth of Nations. It is in some sense a rather unusual turn of events, though. Those who wanted to formalize economic theory have made great strides, but it was thinkers such as Smith, Walras, Marshall, and Keynes, who could think outside the box and break new ground, who pushed the discipline forward. Is it not ironic that in a day and age when increased formalization of economics through computers is possible, some economists would cry, “Too much”?

Ekelund and Hébert (1990) raise several questions that are not easily answered in their conclusion to the chapter on the development of mathematical and empirical economics. Are graduate students of economics being trained improperly? Is the field of economics squandering its human capital, enamored with technique development but reticent to apply those same techniques in a cost-effective way? Do economists have low self-esteem about being treated as less than scientists? Can economics achieve the empirical validity of a true science? Can the development of econometric techniques move too far ahead of economic inquiry? Are professors and researchers of economics killing it as a discipline?

My view is that, regardless of the weaknesses in some of their rhetoric, Ekelund and Hébert (1990) are essentially correct that too much formalization and verification is harmful in a field of study that must address real-world, real-time problems. The logical end point of the study of economics, by its very nature, is to understand economic behavior and make practical suggestions, albeit imprecise ones, within a meaningful time frame. Otherwise, modern economics has little practical value and is in danger of becoming a self-perpetuating, insular field of study.

Reference

Ekelund, R. B., Jr., & Hébert, R. F. (1990). A history of economic theory and method (3rd ed.). New York: McGraw Hill.

Friday, November 7, 2008

Mastering Basic Statistics

Trust me. You can do statistical analysis. The basics of statistics can be mastered... Forget about those mind-numbing textbooks for a second. Descriptive Statistics are all about what the data looks like. Inferential Statistics are all about whether two sets of data are different or if two sets of data have a relationship.

Descriptive Statistics are a summary of what the sample data looks like, such as a measure of central tendency (e.g., the mean for interval data) and measures of dispersion (e.g., the standard deviation (SD) for interval data). Data dispersed about the mean in a bell shape is normally distributed (i.e., 68.26% within 1 SD, 95.44% within 2 SD, 99.7% within 3 SD). A randomly drawn sample is best but rarely possible, so a non-random or convenience sample can be used with justification.
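For those who learn by doing, here is a minimal Python sketch (with made-up sample data) of the descriptive measures just described:

```python
import statistics

# A small hypothetical sample of interval-level data (e.g., test scores)
data = [62, 65, 68, 70, 70, 71, 73, 75, 78, 82]

mean = statistics.mean(data)    # measure of central tendency
sd = statistics.stdev(data)     # sample standard deviation (dispersion)

# Fraction of observations within one SD of the mean; for normally
# distributed data this approaches 68.26% as the sample grows.
within_1sd = sum(abs(x - mean) <= sd for x in data) / len(data)

print(f"mean={mean:.2f}, sd={sd:.2f}, within 1 SD={within_1sd:.0%}")
```

A sample this small will not hit the theoretical 68.26% exactly; the point is that the mean and SD together summarize where the data sits and how spread out it is.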

When compiling descriptive statistics, you need to know whether the sample data (i.e., its level of measurement) is nominal (yes, no, or a label), ordinal (in some kind of order, such as doneness of meat: rare, medium rare, medium, or well done), or a number that has order and whose value means something (such as "that movie is an 8 on a scale of 10"). You also need to know the unit of analysis, such as the individual, group, organization, or society. Descriptive Statistics tell us what Inferential Statistics we can safely use to draw conclusions.

Inferential Statistics are how we make a decision about the POPULATION guided by what the Descriptive Statistics have told us about the SAMPLE data using probability theory. There are two types of decisions: Measures of Difference and Measures of Association. Measures of Difference (z, t, F, etc.) test differences between a number and sample, two samples or more than two samples. Measures of Association (r, correlation, regression) test whether variables move together and possibly whether there is some causal relationship. (Causal relationships are tricky to prove so be careful about saying X causes Y.)

When applying Inferential Statistics, the measures of difference or association that can be used are governed by the level of measurement, the number of samples you are comparing, whether the sample is random/independent, and whether the data is dispersed about the mean like the normal distribution. When you are comparing samples, you have to make sure that the unit of analysis in each sample aligns with the other samples and your research question (e.g., students in a classroom vs. a classroom of students: can a single student be judged by being in a particular class, or should the particular class be judged by a single student?). Test statistics are calculated from the sample data, critical values are looked up on a distribution (probability) table, and you compare the two in hypothesis testing. A low p value (conventionally below .05) is good: it means the result is unlikely to be due to chance alone.

All good quantitative research uses variations of the above to boil the research question down to a testable hypothesis for a large sample in descriptive, exploratory, or causal/experimental research. All good research articles explain how construct validity (i.e., the theory or practical problem), external validity (i.e., how and why the sample was chosen), internal validity (i.e., why what they think they saw is what they saw), and conclusion validity (i.e., how the descriptive and inferential statistics support the discussion) are achieved. A sample size of one in qualitative research might use ethnography, action research, or other methods to build a case study or a foundation for quantitative research.

That's it. That's about all a business manager or MBA must know about statistics. Of course, there is a lot more that you could know, but the basics can be mastered.

Monday, November 3, 2008

Broadcast News Media Research Indicated Bias for Senator Barack Obama

A broadcast news media study by the Center for Media and Public Affairs at George Mason University found that coverage during the 2008 U.S. Presidential campaign of Senator Barack Obama was 65% positive, but coverage of Senator John McCain was only 36% positive.

According to the study by researchers at George Mason University, there was a documented media bias for Obama and against McCain. Did the bias influence voters? I don't know. Was the study relevant news that was largely ignored? I don't know.

The link below has the details. Judge for yourselves...

Source: http://www.cmpa.com/media_room_press_10_30_08.htm

Additional References: Pew Charitable Trust Study of Print Media: "The media coverage of the race for president has not so much cast Barack Obama in a favorable light as it has portrayed John McCain in a substantially negative one, according to a new study of the media since the two national political conventions ended."

Source: http://journalism.org/node/13307

References

The Center for Media and Public Affairs at George Mason University, http://www.cmpa.com/media_room_press_10_30_08.htm

Pew Charitable Trusts Excellence in Journalism, http://journalism.org/node/13307

Thursday, October 23, 2008

Unwelcome Effects of Public Opinion Research

The importance of the public opinion survey or poll has gained prominence in presidential races because of the economy and efficiency of mass opinion polling over the telephone and the Internet. For example, with a relatively small sample of just under 400 randomly selected participants, one can gain a reasonable understanding of the opinions of up to 1,000,000 persons, within a margin of error. The miracle of statistical inference.

A sample of approximately 1,500 randomly drawn individuals may be projectable across the entire nation. The implications are clear. An unscrupulous candidate, who strongly desires to be elected, may communicate only those messages that increase his/her favorable ratings in the polls. On the other hand, a candidate with integrity may use the pollster to determine those messages springing from his/her political ideology that need fine tuning to appeal to the largest group of voters.
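A quick Python sketch (using the standard formula for the margin of error of a proportion at 95% confidence, worst case p = 0.5) shows why samples of roughly 400 and 1,500 keep appearing in polling:

```python
import math

# Margin of error for a simple random sample of a proportion,
# with an optional finite-population correction.
def margin_of_error(n, population=None, z=1.96):
    moe = z * math.sqrt(0.25 / n)      # half-width of the 95% CI, worst case p = 0.5
    if population is not None:         # finite-population correction
        moe *= math.sqrt((population - n) / (population - 1))
    return moe

# ~400 respondents gives roughly a +/-5-point margin, almost regardless
# of whether the population is one million or an entire nation.
print(f"{margin_of_error(400, 1_000_000):.3f}")   # ≈ 0.049
print(f"{margin_of_error(1500):.3f}")             # ≈ 0.025
```

Note that the margin depends almost entirely on the sample size, not the population size, which is why a 1,500-person sample can be projectable across the entire nation.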

Appealing to the largest group of voters is similar in concept to the responsiveness that all politicians in a majoritarian form of democracy must face. Public opinion polling should be used only by politicians and news organizations to gain a better understanding of their audience, but polls alone should not be considered news and should not be reported in a way that will shape public opinion. Is that too much to ask? Is that unrealistic? Perhaps.

In sum, honest and disingenuous politicians alike, and news organizations with a specific agenda, may find the pollster an indispensable member of the team, but there is a societal cost.
