Archive for the ‘Uncategorized’ Category

Predictive Analysis: Large Penetration Vendors

March 24, 2014

Business Marketing

December 1, 2012

Strategic Alliances In Publishing:
Case Study of Readymade Magazine
September 23, 2010

Reinsch, Russell C.


     ReadyMade maintains many strategic alliances that are beneficial partnerships for the magazine. The bi-monthly magazine has a small circulation and a target audience of Generation Y readers interested in Do It Yourself activities. ReadyMade, an independently owned periodical founded in 2001, was sold in 2006 to Meredith Corporation, publisher of many periodicals including Better Homes and Gardens and Parents magazine. Unlike most print publications, which have suffered the double whammy of declining ad revenues and subscriptions during the social media/online publishing boom, ReadyMade's ad revenues grew from 2007 through 2009 while circulation held steady. This is a markedly better scenario than other, better known publications are facing.
     As with all periodicals sold at book stores, newsstands, chains and other retail venues, ReadyMade uses a large distributor to handle all the relationships with the outlets that sell its print publication. ReadyMade places one large standing order with the distributor, and the distributor is then responsible for marketing the shelf space for ReadyMade. This simplifies and streamlines the supply chain, distribution channels and other management issues between ReadyMade and the outlets that display and sell the magazine. The distributor, in return for an agreed-upon commission (a percentage of the newsstand sale price), is responsible for placing the magazine in the different venues and negotiates the number of issues purchased with the retailers. These negotiations cover the specific display and position of the publication, special retail promotions related to the publication, the number of copies ordered (referred to as the draw), and so on.
     There are many elements involved in selling print magazines, and ReadyMade's distributor handles the retail outlets only. Another method for selling magazines is by subscription, where a person pays up front to receive the publication either through the mail or as an electronic edition. Many daily newspapers and magazines have set up digital/online editions and subscriptions; examples include the Wall Street Journal and Smithsonian magazine. Library subscriptions can be handled either by the retail distributor or by developing another strategic alliance with a distributor who specializes in the library market.
     ReadyMade has developed a marketing strategy that appeals to its target audience. Among a plethora of social media marketing options, there is a blog, Facebook and Twitter accounts, RSS feeds, and a free ReadyMade account that delivers two different newsletters and special offers straight to the account holder's email. As with most publications, the majority of the revenue comes from advertising, and, following industry standards, advertising rates depend on circulation numbers. Circulation numbers include all sales from retail outlets, both internet and bricks-and-mortar based.
     Strategic alliances are an important part of keeping ReadyMade financially viable. With a distributor handling their distribution channels, ReadyMade can focus on the editorial side of their business while the distributor focuses on the marketing aspects of selling the print magazines.

A comprehensive infographic guide to UX careers

November 10, 2012



August 3, 2012

Crowd funding (CF) activities can be divided into three basic types: (1) equity, where the company attracts investors via the sale of shares; (2) donation and reward, where backers are essentially contributing for goodwill; and (3) peer-to-peer lending (p2p), where the company goes into debt to its investors.

Some CF platforms specialize in certain types of companies, projects or fundraising activities, while others have no niche and allow CF for anything. Fifteen of the top 40 sites require fundraisers to meet a pre-stated funding goal before the raised funds will be released, in an 'all or nothing' distribution scheme. Four sites distribute any funds that are raised regardless of whether the project met its goal, and the borrower keeps any money raised; five of the CF platforms use some form of hybrid structure where funds distribution can go either way. Sites that allow users to keep any funds raised are especially popular.
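The distribution schemes above boil down to a simple payout rule. Here is a minimal sketch; the `Campaign` type and scheme names are illustrative assumptions, not any platform's actual API:

```python
from dataclasses import dataclass

@dataclass
class Campaign:
    goal: float      # pre-stated funding goal
    raised: float    # total pledged by backers

def payout(campaign: Campaign, scheme: str) -> float:
    """Amount released to the fundraiser under a given distribution scheme.

    'all_or_nothing': funds released only if the goal is met (the Kickstarter model).
    'keep_it_all':    fundraiser keeps whatever was raised, goal met or not.
    """
    if scheme == "all_or_nothing":
        return campaign.raised if campaign.raised >= campaign.goal else 0.0
    if scheme == "keep_it_all":
        return campaign.raised
    raise ValueError(f"unknown scheme: {scheme}")

# A campaign that raised $8,000 against a $10,000 goal:
c = Campaign(goal=10_000, raised=8_000)
print(payout(c, "all_or_nothing"))  # 0.0 -- goal missed, nothing released
print(payout(c, "keep_it_all"))     # 8000.0 -- fundraiser keeps the raise
```

A hybrid scheme such as Peerbackers' "all or nothing with a twist" would simply be a third branch that also checks whether the project owner can still deliver the promised rewards.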

Equity sites in the UK have been active for years. While the law allowing equity CF in the US was passed this April, the SEC has 270 days from the date of passage to write the regulations, and unaccredited investors cannot participate in equity CF on US sites prior to these regulations. Here is a summary of the more important sites, alphabetically from K to Q.

Kickstarter. CF Type: Donation. Niche: loosely defined. $-Distribution: All or nothing. Summary: recognized as the #1 player in their segment. Kickstarter does not provide ACH management.

Kiva. CF Type: p2p. Niche: projects in developing countries. $-Distribution: borrower keeps any money raised. Summary: Launched in 2005, claims to be the first mover in microlending for entrepreneurial projects. Funding Circle also claims this. Either way, Kiva is an important competitor in p2p.

CF Type: Equity. Niche: early stage angel capital for innovative technologies and life sciences. Summary: SEO is terrible.

Lendingclub. CF Type: p2p. Niche: personal and business loans. Summary: first to register with the SEC and offer a secondary market for p2p loans; an important competitor in the p2p space. Borrowers need a $70K salary, a 660 FICO score, and a clean record for the prior 12 months. Lendingclub actually carries the notes. Nice stats page. There is no transparency between lenders and borrowers.

Medstarter. CF Type: Donation. Niche: Healthcare. $ Distribution: All or nothing. Summary: their niche is one specifically excluded on Kickstarter. In beta. Boring UX.

Microventures. CF Type: Equity & p2p. Niche: connecting angels with tech startups. Distribution: All or nothing. Summary: Registered broker-dealer, required to perform due diligence and handle investor relations. Average raise is $150K. 4K investors, $4M in funded transactions (not necessarily a lot in comparison to some other sites).

OnSetStart. CF Type: Donation but positioned to do Equity. Niche: None. Distribution: All or nothing. Summary: Project creators keep 100% ownership of their work. Tools for funding projects. Member of NCFA. Traffic is supposedly growing quickly but the site appears to have almost no activity. Rated online as very easy to use, but site navigation is actually quite awkward.

Peerbackers. CF Type: Donations. Niche: None. $ Distribution: all or nothing with a twist; if a project has not met its funding goal but the project owner can still deliver the promised rewards, then the amount raised will be released to them. Summary: interesting menu options for finding projects on the site. They have received good media attention.

Peoples VC. CF Type: Equity. Niche: Hard to tell. Summary: marketplace functionality, well designed calculator, strong integration tools, "Crowdvestor" education course. Peoples VC has some very sharp people on their team. One of the top US equity sites, also receiving good media attention.

CF Type: Donations. Niche: scientific research. $-Distribution: All or nothing. Summary: small projects, median size in the $10-15K range. Average donation is $70 (comparable with Kickstarter). Killer graphics on the website. In beta. Good media coverage.

Prosper. CF type: p2p. Niche: personal needs. Summary: investor oriented; the site has a schedule with about 38 rates from AA to HR, a Quick Invest feature, developer tools and data mining resources. Info on the procedures is broken down into categories and well presented. With $370 million in personal loans funded (3X more than Kickstarter), they are one of the top CF sites.

Quirky. CF type: co-creation; a kind of fourth category of CF. Niche: inventors and nerds. Summary: the site evaluates product ideas, picks winners, manufactures those products, and sells them on and off the site, taking about a 2/3 cut of the revenue. Members earn money right through the site: they vote on the inventions and ideas with the most potential, while playing a legit pricing game fashioned along the lines of The Price is Right. Link layout and functional naming on the site are quirky, but the graphics are good. A unique player in the co-creation segment, Quirky receives major media coverage and maintains partnerships with over a dozen household-name retailers.

Commercial functionality within LinkedIn

June 6, 2012

Executive Summary

LinkedIn is well balanced in financial terms, generating revenue through three categories of monetized solutions: recruiting, advertising, and subscriptions. Currently, each of the three categories contributes fairly equally to total revenue. Accounts in the USA generate two thirds of total revenue, with international accounts making up the other third. LinkedIn also offers four categories of free products to its users: profiles, networking, information exchanges, and widgets for integration/APIs/mobile applications.

Three solution categories generate commerce

Recruiting (“Hiring Solutions”) grossed approximately $100 million in 2010. This category consists of job boards, talent locators, referral engines, a matching tool, plus a few other products. LinkedIn competes in this market with Monster, CareerBuilder, Indeed, and other businesses providing job search services. Posting a job opening on LinkedIn cost approximately $200 a month in 2009 [Walker 2009].

Advertising grossed approximately $80 million in 2010. Advertising options include pay per click (PPC) ads, targeted marketing windows, and a recommendation function, among others. Here LinkedIn is in direct competition with the broader marketing industry.

Subscriptions grossed $70 million in 2010. Subscriptions are primarily software products, including advanced intranet search filtering capability, an intranet search agent, statistical reporting on profile activities, and a handful of other business- and executive-oriented features.

Role of the LinkedIn site, within its industry

Media articles, analyst reports, and the like sometimes compare LinkedIn to the other prominent social networks in the USA, although it is just as often excluded from such comparisons. LinkedIn owns the professional demographic, however, acting more like a corporate blog that crosses national and business network platforms. LinkedIn's users go to the true social networks for their social activity, differentiating LinkedIn as their professional network, where they maintain higher levels of discretion with their connections and hold higher expectations of trust and security over their profile. While different from Facebook, Twitter and YouTube in terms of social activity, LinkedIn does compete with them in the ad revenue markets [Miller 2011, Vahl 2012].

LinkedIn is also a software company, albeit a non-traditional one in terms of vendor “lock-in.” LinkedIn could be considered slightly competitive to email providers with search capability such as Google or Yahoo, if it decided to expand its search capability outside of its own intranet.

What LinkedIn does

LinkedIn has a number of interesting products and features that allow users to network. Among them are the Behance Portfolio Application, CardMunch for iPhone, SlideShare, dashboard analytics, and Groups. Groups are highly popular self-organized communities.

Points regarding functionality of the site

The majority of commercial activity on LinkedIn is generated by a minority of its users. Background research for this report indicates that most people who use LinkedIn are not aware of many of the capabilities and products stated in the company's 10-K. This lack of awareness can be attributed to two factors. First, many LinkedIn products are neither promoted nor available for purchase through the website, only through "field sales organizations," which operate from three regional headquarters in the USA, located in Chicago, New York, and San Francisco. The field sales organizations perform offline sales operations by calling directly on their customers.

Imagine a scenario where a new LinkedIn user builds a personal profile and is considering what else the site has to offer his or her business. They may start looking around the website, checking out available products and considering upgrading to a paying customer. At what point are they able to see the existing inventory or selection of products that are not visible online? Never via LinkedIn; perhaps on YouTube! It would be the same as a car dealer that kept half of its inventory on a separate lot and allowed customers to walk around the first lot with no idea there were other, possibly more attractive vehicles for sale on an exclusive lot somewhere else. We would think that the dealer was seriously "missing the boat" in terms of marketing its inventory.

The lack of awareness about LinkedIn products can also be attributed to the relative difficulty of locating tutorial information on what products are provided. LinkedIn user guides are available in only two places: on the Learning Center page of the LinkedIn website, and on YouTube. Unaffiliated individuals have placed hundreds, maybe thousands, more tutorials on YouTube describing how to do something on LinkedIn than LinkedIn itself provides.

Videos and information about LinkedIn's products that are available on its own website are too hard to find. This explains why there are hundreds of questions from users asking how to do something on LinkedIn in the "Using LinkedIn Q&A" section. Users are asking for help on performing the simplest of tasks on the website. Answers are fielded by other LinkedIn users, with random latency: a question may be answered in less than five minutes, or possibly never, and the asker cannot know which. Where is the company itself in all this? These questions should be swiftly fielded by someone from the organization; this is a prime opportunity for a company representative to step in and be of assistance, possibly opening the door to selling advanced services through a live chat window or any other means of interaction with users. At a minimum, LinkedIn should have a phone number prominently located on key pages, to accommodate users who wish to contact the website itself with service questions, as The Ladders website does. Please see the accompanying use-analysis of LinkedIn's site functionality in Excel.

Part of the attractiveness of LinkedIn is its simplicity. The website has a classic web 2.0 appearance, but there are areas where too much simplicity or minimalism becomes a restriction on commerce. The "Upgrade to Job Seeker Premium" promotion "Unlock Salary Estimates for Jobs on LinkedIn" has a small, generic bar graph with three unmarked bars that sits statically in an ad box on the sidebar. If this space were used to show a demonstration of the product, with active screens inside the box instead of a random graphic that has no informational value, LinkedIn users could easily learn about available products. Presentation slides could be positioned like any other advertisement in the sidebar, or quietly active on some portion of various pages, with an option for the user to turn the audio on when the demonstration catches their eye.

There are other similar situations throughout a user's experience on the LinkedIn site that have a negative effect on its commercial functionality. Example: when a user clicks on something that is not a free product, they are immediately taken straight to a sales check-out page (a cash register) with a short list of about six features for sale, but the interested party has not been given an opportunity to see any information about these features. LinkedIn is egregiously missing the opportunity to build value in its products. Other companies' websites take advantage of these situations very effectively through chat windows that open up, or interactions with some type of avatar, offering to demonstrate products or answer questions for the buyer. LinkedIn could provide an option to see a pertinent video that loads automatically, with a big click-to-play triangle in plain view, so the user can get a demonstration of the product they are interested in. Were this to happen at some point before the customer is forced to decide between buying the product without knowing its benefits, going elsewhere to look for information about those benefits, or just rejecting the offering altogether, the sales conversion rate for that product would be higher.

Looking forward: improvements for directly or indirectly improving commercial activity

Upgrade email.

Email is an important communication medium for LinkedIn's demographic. LinkedIn may not endeavor to compete with Google or other email providers, but its users should not have to deal with overly clunky methods for manipulating mail while they are in the LinkedIn email system.

For example, after reading a message, a user currently has to go back to the inbox just to select the next message for viewing. This is an archaic extra step that the more efficient email providers eliminated years ago. Basic capabilities for formatting text would be nice as well.

The future of search:

LinkedIn is clearly aware of the growing importance of mobile solutions. The increasing demand for mobile capabilities will have a direct effect on the type of search functions that result in commercial transactions. As a reduction in traditional indexed-search usage means shrinking market share for LinkedIn's competitors that currently survive off of these products, the opportunities for LinkedIn to step in and find new markets must be appealing. Development of intellectual property should be directed toward the pin-point focused search paradigm, the way information is actually consumed on mobile. Pin-point search services for mobile applications return actual, direct answers to questions such as "What is the per capita income for the US, China, and Germany?" computationally, in the way Apple's Siri does; not a search result list from which the user has to drill down further by choosing from a selection ranked by an algorithm. Interfaces should also be constructed on a control panel model, where users stay on the same page, versus the elevator hierarchy structure.

Give premium users more information.

Subscribers would pay for information on how an employer subsequently ranks submitted applications for employment. LinkedIn already performs a pre-ranking service for employers, scanning the candidate's resume or profile for keywords, etc., before sending the employer a ranking based on the results of the character recognition process. The software heavy lifting has already been done, and the cost has already been paid for as part of the job posting service the business subscribes to. Giving candidates the option to purchase the reused information would generate additional revenue for LinkedIn.
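The pre-ranking described above can be pictured as a simple keyword-match scorer. LinkedIn's actual ranking process is proprietary and undisclosed; the function names, scoring rule, and sample data below are purely illustrative assumptions:

```python
import re

def keyword_score(profile_text: str, job_keywords: list[str]) -> int:
    """Toy pre-ranking: count how many job keywords appear in a candidate's profile."""
    words = set(re.findall(r"[a-z+#]+", profile_text.lower()))
    return sum(1 for kw in job_keywords if kw.lower() in words)

def rank_candidates(profiles: dict[str, str], job_keywords: list[str]) -> list[str]:
    """Return candidate names ordered by descending keyword match."""
    return sorted(profiles,
                  key=lambda name: keyword_score(profiles[name], job_keywords),
                  reverse=True)

# Hypothetical candidates for a Java job posting:
profiles = {
    "Avery": "Java developer with Spring and SQL experience",
    "Blake": "Graphic designer skilled in Photoshop",
}
print(rank_candidates(profiles, ["java", "sql", "spring"]))  # ['Avery', 'Blake']
```

The point of the sketch is that once this ranking exists for the employer, exposing the candidate's own score back to the candidate costs almost nothing extra.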


Miller, M. 2011.

Vahl, Andrea. 2012?

Walker, M.

Is lowering your price the only effective strategy in a recession or bad economy? (No)

March 1, 2012

The first consideration in any price setting is to define the goal or objective of the company. Companies measure product success in a variety of different ways, and not all pricing techniques are appropriate for every objective the firm is undertaking. The marketing manager needs to know which activity is going to be measured: ROI, as is most common, or market share? The answer to this question makes a big difference in what price to set for a product, even in a recession.

Other important questions also need to be answered. For example, how long has the company been in business and/or what is the age of the industry? What is the relationship of this company to the competition? Is it a service oriented business? Organizations usually gain more control over their marketing and sales scenarios or more power over their suppliers the longer they are in business. In most situations service organizations should avoid lowering prices. However, almost everything in business is relative to what is happening with serious competitors, including unforeseen competitors that might enter the market because barriers like capital expenditure are low. In certain cases, lowering prices may be the best option in order to retain market share or to counteract potential loss of existing business.

In most cases differentiation is a better strategy than discounting. Cutting price before building value is the most unsophisticated approach to selling, and simply pushing prices down will not guarantee success. For one thing, competitors can match prices. Secondly, customer loyalty is achieved by differentiation, not by price. A manager needs to protect the reputation of the brand while also maintaining the gross margin on the product; irresponsibly cutting the price can affect both brand recognition and gross margin very negatively.
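The damage a discount does to gross margin is easy to quantify. Under the standard simplifying assumption that unit costs stay fixed, the sales-volume multiplier needed to keep gross profit constant after a price cut is margin / (margin − discount); the function name below is illustrative:

```python
def volume_needed(margin: float, discount: float) -> float:
    """Sales-volume multiplier required to keep gross profit constant after a
    price cut: new_units / old_units = margin / (margin - discount),
    where both margin and discount are fractions of the selling price."""
    if discount >= margin:
        raise ValueError("discount wipes out the entire margin")
    return margin / (margin - discount)

# A product with a 30% gross margin, discounted by 10% of its price:
print(round(volume_needed(0.30, 0.10), 2))  # 1.5 -- you must sell 50% more units
```

In other words, a 10% price cut on a 30%-margin product demands 50% more unit sales just to hold gross profit steady, which is why cutting price before building value is so risky.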

A successful manager needs to continually perform research to gather and update data on what really determines value in the relevant industry. The manager can then leverage that knowledge to better understand the kind of strategies that are employed by different competitors within the industry. Value-based pricing is a technique used in tough economies or in times of recession. Value bases can focus on design, superior technology, unique features, or even custom distribution. “To understand the customer’s perception of the value of your product or service, look at more subjective criteria such as customer preferences, product benefits, convenience…” (Small Business Notes).

A small business can alleviate some of the burden of marketing by using a seasoned sales representative who works on commission. A seasoned sales rep can be the secret to getting penetration without setting too low a price, because he or she already has established relationships in the industry, knowledge of the territory, and knowledge of the competition.

Small Business Notes: Pricing – Value-Based Pricing. Retrieved November 17, 2010.

Kodak: switched from b2c to b2b.

January 17, 2012

Kodak made the front page of the Wall Street Journal last Thursday, with the Journal reporting that this American titan of the 20th century is now hoping to avoid bankruptcy by selling off its patent portfolio. For the past five years Kodak has been able to hold back the bleeding by filing patent infringement lawsuits (Spector, 2012), but that strategy is running dry. Kodak's largest problem stems from its failure to capitalize on its own innovation way back in 1975 [the digital camera], because it was reluctant to disrupt the existing revenue stream that came from film. Let's take a look at how the company attempted to deal with issues of market share and service customization over the last few years.

Until the world of photography was revolutionized with the advent of digital imaging, Kodak’s primary focus was on the consumer marketplace with a secondary focus on the business marketplace. As they lost market share in the consumer side of the business, Kodak focused on building its business products and services and successfully transformed itself into a business-to-business (b2b) powerhouse. In an On the Record podcast interview, communications consultant Eric Schwartzman declares “Over the last five years, Kodak’s revenue from consumer film has dropped from $15 billion to $200 million, but the company still has sales of $8 billion annually through a portfolio of new products, most of which are less than two years old and 80 percent of that revenue comes from business customers.” (Hayzlett, 2010)

Mark Weber was the Vice President of Worldwide Sales Partnership in Kodak’s Graphics Communications Group, and led the sales efforts for Kodak’s Digital Printing Solutions Group, a strategic business unit. In a 2008 video, Weber stated this business group was the fastest growing business in Kodak’s portfolio (Cengage, 2008). Weber delineated their specific marketplace for their products and services as the commercial printing industry, in addition to government and corporate businesses. The four main segments of products in their business portfolio were digital printing, consumables, workflow, and services.

In the video Weber describes the different approaches Kodak adopted as they transitioned into b2b. For example, Kodak had to change their sales model to a direct and indirect sales force model and adjust their customer touch points. Weber points out that services and solutions are the most difficult items to sell and that it is imperative to point out to potential clients not only the features but the benefits of their product and services offerings.

In a press release dated October 23, 2008, Weber is quoted as saying “Marketers and others who communicate with print continually strive to distinguish their materials and set themselves apart from their competition. With solutions that include using variable data printing for creating customized documents as part of an integrated, personalized campaign or producing a raised print that looks and feels like the item in the image, digital printing provides many opportunities to maximize communications effectiveness.” (Kodak Press Release, 2008)

Charles Lamb writes in his book MKTG2 that "An important issue in developing the service offering is whether to customize or standardize it… Instead of choosing to either standardize or customize a service, a firm may incorporate elements of both by adopting an emerging strategy called mass customization." (Lamb, Hair & McDaniel, 2008) Weber states Kodak utilizes this mass customization strategy.

Kodak offers to help its customers grow their businesses with a full complement of services, whether customized or standardized. Weber explains that some of the products and services are standard, out-of-the-box offerings. However, the workflow product line provides customized solutions and services that tie all of Kodak's capabilities together for its commercial printers, whether they are traditional or digital printing customers. In addition, Kodak's web-to-print service offers some customization related to the regional and seasonal aspects of customers' printing business needs.

Customized services are also available as part of Kodak's packaging and transactional printing services. Weber describes the coupon printing capabilities Kodak provides to Papa John's, where each coupon is essentially customized to the specific consumer recipient. Finally, Weber discusses Kodak's outreach to its customer base through surveys and user group associations.

Kodak has been a household name for over a century with its cameras and film. Some of the mistakes it made over the years are now classic corporate-giant errors. As "Kodak teeters on the brink" of bankruptcy (Spector, 2012), the American icon is paying close attention to its customers so it can provide the best possible solutions and services, and escape a tragic end. Meanwhile, business consultants everywhere are paying close attention to the correlation between what Kodak does and whether it survives.


Cengage. 2011, March 19. Kodak – Services and Nonprofit Organization Marketing [Video file]. Retrieved from

Kodak Press Release. 2008. Kodak Experts Discuss Emerging Trends and Opportunities in Free Graph Expo Seminars. Retrieved March 21, 2011 from

Lamb, C., Hair, J. F. Jr., McDaniel, C. 2008. MKTG2. Mason, Ohio: Cengage Learning.

Hayzlett, Jeffrey. 2010. Consumer Film is Dead. But Kodak is Alive. Jeffrey Hayzlett Explains. Retrieved March 20, 2011 from

Spector, Mike. 2012, January 5. Kodak Teeters on the Brink. Wall Street Journal, p. A1.

Challenges to Science Philosophy and Theory

January 13, 2012

Table of Contents

Section 1 –

Introduction

Definition of terms

Background

Section 2 –

Philosophical problems for science in the 20th century

Demarcation: the line between what is science and what is not

Falsification and Induction

Section 3 –

Theoretic problems for science in the 20th century

Constructivism

Section 4 –

Solutions in Philosophy and Theory

Section 5 –

Conclusion

 Section 1 – Introduction

     Science and its methods suffered from a full spectrum of extremism in the 20th century. Scientists in the early 1900s operated with an overly austere view of what defined their discipline. The prevailing philosophy of the time, now regarded as the 'empiricist' philosophy, was principally represented by a group called the Vienna Circle. In the decades following the turn of the century, science was forced to deal with attacks directed toward the scientific method and doubts about justifications for theories, which presented challenges to both the philosophy of science and the social interpretations of the discipline.

The rigid and restrictive grasp of the empiricists was gradually loosened by powerful theories put forth by philosophers who challenged conventional thinking about science, namely the theories championed by Karl Popper, W. V. Quine, and Thomas Kuhn. As recognition of the qualities in these theories gained adherents throughout the scientific fields, the pendulum of sentiment swung away from the strict views held by the Vienna Circle toward a more moderate position, in some ways closer to the metaphysical principles of earlier centuries, like those of Francis Bacon and René Descartes. (Descartes felt that even if everyone were to agree on something, like the Ptolemaic theory of the universe, it might still be a deception.)

Eventually, some of the looser practitioners focused so intently on the shortcomings of the scientific method, and on whether we should believe science provides true accounts of our world, that they pushed the pendulum past the point of common sense, swinging beyond the center point of balance and overcorrecting into the other extreme: a range where relativism, realism, and constructivism postulate much different assertions about science and theory.

The thesis of this essay maintains that humans can understand reality, and can judge whether theories are adequate, by using the best parts of science, which are sufficiently evidentiary. It allows for the belief that science is and can be empirically successful, without automatically warranting the belief that the truths of theories always have to be perfect.

Definition of Terms

Ampliative rules: Rules that go beyond the given information while still providing justification for the inferred conclusion.

Constructivism: The constructivist concept of rationality involves conscious analysis, and deliberate design of models or rules. The models classify individual behaviors in order to explain general behavior. It is neo-classical, but not inherently inconsistent or in opposition to Vernon Smith’s  ‘ecological’ form of rationality. The two are different ways of understanding behavior, that work together.

Empiricism: A benchmark era for science, the years around 1900, when hypotheses would be accepted only under austere circumstances: the cold hard facts, having been confirmed and verified through deductive testing, were thought to constitute objective observation and to involve universal laws of nature.

Falsification: Karl Popper suggested the demarcation line for science could be found by falsifying theories instead of trying to verify them. Scientific theories therefore need to contain something that can actually be disputed. The position "Cherry pie is good" is not falsifiable.

Induction: Considered the biggest obstacle to finding a scientific criterion for theory choice. The problem of induction pre-dates the 1800s; it is deeply philosophical and tricky to comprehend. Technically, it concerns a cognitive process that includes statistical laws, or conditional probability. An interesting place to start when setting out to understand induction is the “Monty Hall” problem, where pigeons learn from experience in laboratory tests to switch doors, but humans do not.

Realism: an overly loose interpretation of intangible, unobservable things, to the extent that they are considered objective items of evidence in every case. Even if they are independent of accepted concepts, they still make for empirical theory, and belief in them is still required for coherent science. In one version of realism, the success of science is put forth as the proof of its objectivity. Science has not historically been so successful, however; in fact, it has often been the opposite.

Underdetermination: the Duhem-Quine (D-Q) thesis. D-Q has two components: 1) there are too many unknowns for evidence to be sufficient to identify what belief we should hold, or what decision we should make between rival choices, so theories must remain unsupported; 2) a small theory can never be isolated and tested by itself; if a small theory appears to fail a test, the entire corporate body, or the test, or the scientist must be called into question, not the small theory.
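The pigeon-versus-human result for the “Monty Hall” problem mentioned under Induction rests on a probability claim that is easy to check by simulation. A minimal sketch in Python (door labels, the random seed, and the trial count are arbitrary choices, not part of the original problem statement):

```python
import random

def monty_hall_trial(switch: bool) -> bool:
    """One round of the Monty Hall game; returns True if the player wins the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # The host opens a door that is neither the player's pick nor the car.
    host_opens = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Switch to the one remaining unopened door.
        pick = next(d for d in doors if d != pick and d != host_opens)
    return pick == car

random.seed(0)
trials = 100_000
stay_wins = sum(monty_hall_trial(switch=False) for _ in range(trials)) / trials
switch_wins = sum(monty_hall_trial(switch=True) for _ in range(trials)) / trials
print(f"stay: {stay_wins:.3f}, switch: {switch_wins:.3f}")  # switch ≈ 2/3, stay ≈ 1/3
```

Switching wins roughly two-thirds of the time, which is the lesson the pigeons learned inductively, from repeated experience, and that human subjects tended to resist.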

Background of Philosophy

     As described in the introduction, science held to an extremely narrow concept and rigid interpretation of scientific procedure at the beginning of the 1900s. The indisputability of facts was paramount, a virtue of clear-cut reasoning and exacting rationality. Only unmistakable evidence could be used in investigations to discover rules and laws. Laws for prediction and truth are what distinguished science, and the activities of science sat above this line of demarcation. This overly strict philosophy hampered practitioners’ efforts to understand the world around them. Skeptics and critics of empiricism claimed that the true nature of testing is limited, since theories never find perfect “truths,” and that empiricism failed to detect this very deviation between itself and reality.

Background of Theory

     After the Renaissance, human knowledge developed to the point where it established itself as a full and authentic partner to reality. Humankind came to trust that any subject could be credibly understood if the activities of science and technology followed systematic discovery of evidence. Intellectual communities received increasing support, gradually replacing the old-world way of using the senses as inputs and then haphazardly constructing a belief from there. In this way science and technology eventually became institutionalized in the twentieth century. At the apex of science’s heyday, the Vienna Circle permitted only the narrowest definitions of what constituted a valuable hypothesis. Scientists and laypeople could accept them or not; there was no middle ground. Nor was there any need to postulate hidden entities: the Circle did not want the rules of the universe to trail off into an infinite string of explanations.

Popper advocated an innovative way to identify the products of science, arguing that scientific inferences do not use induction. His theory loosened the structure of what constituted the infamous demarcation point.

Kuhn wrote that everything is relative to the culture or time period in which it exists, and that the one thing we do know for sure is that science will be rewritten in the future. Kuhn proposed that the context of time breaks the line-of-descent model in which old science serves as the foundation for newer science; he held that two different periods of science are not comparable, and he acknowledged the existence of subjective elements within science.

From there we see science’s dependency on theory: science can never escape its relationship with theory, because even the laws of science will change over time, or at least be conceived differently from one society to another. From this outlook, science depends on theory as a setup or precursor for the scientific method. In light of this dependency, social scientists highlighted various troublesome issues, such as conflicting evidence, partial evidence, and weird evidence, and used these issues to critique the scientific method.

Larry Laudan proposed splitting the activity of problem solving from the concept of the solution. In this perspective, effective problem solving remains a rational activity while what counts as a solution is allowed to be relative, and in this way Laudan found an answer to a major problem in determining the acceptability of a theory.

Section 2 – Philosophical Problems for Science

     The how-to component of justifying a belief is the most important epistemic problem for scientific investigation. It is equally problematic for induction. Science entered the 1900s with a pre-existing problem of induction, stuck like a thorn in its side, that it had carried around for hundreds of years. David Hume explicated the problem in his mid-eighteenth-century works, and it has been seen as the major obstacle for science ever since.

A second serious challenge for science surfaced as more attention turned to the fact that every theory, or at least some part of every theory, is eventually found to be inadequate or wrong.

In a third challenge, we came to face the fact that scientific methodology, like every activity with humans as practitioners, works in ways we do not exactly understand. Although the empiricists in the Vienna Circle attempted to deny it, science in practice involves social aspects that are subjective, and a general method for obtaining ‘correct’ conclusions through objective investigation will not always follow some universal recipe for explaining the world. Every person has a unique set of principles; we can each look at the same data and come to different conclusions, and science has proven unable to escape this ‘problem.’

Demarcation: For Rudolf Carnap, Carl G. Hempel, and the Vienna Circle, the demarcation line stood as the separation between science’s concrete evidence and everything else below it, establishing a solid baseline for the reputation of scientific methods. They were deeply committed to observation and measurement that could be used to formulate laws with predictive power, and it was these bullet-proof rules that formed the backbone of their model of science. Empiricists were especially enamored of the predictive power of a rule or law.

Falsification and Induction: Popper’s solution to demarcation suggested we not worry about confirmation and instead focus on falsifying a theory. Popper argued that since we are limited to finite sets of observations, anything can technically be confirmed using induction; he held that true scientific critique uses only deduction, not induction. Unfortunately, we cannot simply deny that we use induction. Wesley Salmon writes that with Popper’s falsification we would be stuck with infinite conjectures, and, according to Salmon, Popper’s ideas when closely examined contain circular runarounds. As summarized by Scott Scheall at Arizona State University: “we cannot use a conjecture’s degree of corroboration as a measure of its reasonableness as a basis for prediction. To do so would be to let induction in through the back door and we would again be saddled with the problem of induction. In other words, a conjecture’s degree of corroboration tells us how well it has performed with respect to past predictive tests, but it tells us nothing (logically) about how it will perform in future tests.”

Thus Popper’s falsification, with its contingent sub-premises of conjecture and corroboration, fails to specify a demarcation for science or to formalize the scientific method much, if any, better than past attempts. Laudan brings final clarification to the discussion, however, noting that we never need an assertion to be true in a perfect sense in order to accept it; justification for induction is simply not required.

Section 3 – Theoretical Problems for Science

     As described above, Popper proposed falsification as the solution to the problem of induction. The D-Q thesis of underdetermination shows that falsification is not a workaround for the problem of induction: D-Q declares, first, that the procedures one would use to falsify theories are ambiguous, and second, that we can only falsify an entire corporate body, not a single small theory in isolation. Later theorists then expanded on the weaknesses identified by D-Q, interpreting it as showing that rules or “as-if” rationalities are impossible.

In his attempt to loosen the overly strict grip of empiricist philosophy on science and provide guidance in deciding which theory to follow, Kuhn championed the idea that demarcation is only relevant within normal science, and that what makes a theory scientific is the absence of debate over theories; hence only when critiques are silent are we experiencing science. Kuhn saw two distinct periods of scientific activity: the period of what he called normal science makes up the super-majority of time, and only during the very rare revolutionary periods would Popper’s falsification be useful for demarcation. He also saw any challenge to a theory as necessarily directed at the scientist, not at the paradigm itself. Kuhn agreed with D-Q in this respect, but whereas D-Q underdetermination treats paradigms as more or less static and permanent, for Kuhn neither the standards of evaluation nor the conditions in the field are permanent; they are always changing.

Changing scientific evidence causes problems for anyone who wants to adhere to a particular theory. Imagine a person who, based on existing knowledge and theories of food science, decides to eat fish for the omega-3 acids that are good for the heart, or to exclude fish from the diet because of its mercury content. To then hear of a new study determining that those same omega-3 acids are apparently bad for the prostate, while the trans fats thought to be bad for the heart are good for the prostate, calls the whole paradigm of food science into question.

People operate with some kind of personal philosophy, whether to believe in no theory at all or in some theory in particular, and might at this point find themselves with a freezer full of fish they no longer wish to eat, because science has decided “healthy eating may be a much more complicated matter than nutritionists previously realized” (The Week, 2011).

The D-Q principles (which advise that theories must remain unsupported), Bacon’s analysis that almost nothing is a full treatment of a subject for everyone (and that there is no single question on which all people can agree on the answer), and various misinterpretations of the critiques of empiricism and of Popper combined to produce unwarranted promotions of constructivism, realism, or relativism by Bruno Latour, Paul Feyerabend, and several others.

Laudan corrects the D-Q/Kuhn inseparability-of-paradigm pyramid structure by replacing it with a web structure, weakening D-Q to the point of rendering it moot. Laudan also liberalized the standard view of paradigms as static systems; he explained that they are always comparative, subject to change, and dependent on circumstances of context. Determining whether certain criteria are more important than others is not a straightforward process, but we have no reason to entertain unbalanced concepts like relativism while we still have common sense at our disposal. Laudan also clarifies that induction is really not such a big problem once ampliative rules of evidence are incorporated.


     Constructivism runs into problems in the social studies because social theories are composites: they assemble constructed parts into wholes, and into schemes of relationships that are interpretations, but they cannot do more than that. The constructed models leave out some of the parts. They are schemes that connect distinct, single things by using relationships we understand; they create wholes, but this does not make them factual. We report on them using terms like ‘New York City’ that do not have sharp, precise definitions, because they may have a variety of properties. For example, a problem for the prominent social science of economics is that it cannot explain how people go from a starting point, through practice in self-regulated systems, to finding equilibrium in personal exchange without the use of consciously constructed models. The constructed model does not predict the higher level of cooperation or reciprocity that actually takes place in the market. Studying behavior, we see people use their unconsciously learned experience when they need to make spontaneous moves; they dynamically figure out what car insurance to buy or how to evaluate university ranking matrices, either without, or together with, the existing instructions in the constructed schemes; so the schemes often have little legitimate purpose or are redundant.

Section 4 – Solutions in Philosophy and Theory

     Laudan fixed the problems introduced by loose interpretations of D-Q by clarifying that science is neither as static nor as inseparable as D-Q posits, and he split the overused concept of “theory” into big and little theories, where the big ones function as tools and the little constituents do the solving of problems. Thanks to Laudan’s perspective we have a clear picture of the formality of the scientific method and clarity on how we can choose between theories.

People need to be able to understand reality and to judge whether theories are true and whether evidence is real. This is more difficult when the subject of discussion or observation involves something as invisible as the chains of bondage in Stockholm syndrome. At the opposing poles of an ongoing argument over whether to believe in invisible entities before they have been technically verified, realists and empiricists hold firm beliefs about when an unobservable can be considered real. Bas van Fraassen gives an agnostic account of particles that are too small to see, noting that the best available explanation is often good enough as a representation of the truth; most importantly, he recommends taking unobservables on a case-by-case basis. Decisions on invisible particles and unobservables matter when we consider forensic science testimony, DNA, and other evidence that jurors may not fully appreciate but which has the power to put people in prison. Jurors often expect science to be responsible for solving the case, when in fact forensic evidence is occasionally found to be invalid (Begley, 2010).

In a 1998 paper published in the esteemed medical journal The Lancet, author Andrew Wakefield linked the childhood MMR vaccine to an increased risk of autism in children. Thirteen years later, after much debate, scientific reexamination, and a plethora of class-action lawsuits, the link has been discredited and the author vilified both for “bad science” and for perpetrating a fraud. But the damage caused by the claim is hard to undo. Despite scientific evidence to the contrary, many people still believe that childhood vaccination is a confirmed major cause of autism. While it is acknowledged that vaccines can, on rare occasions, cause severe side effects, the U.S. Institute of Medicine rejects the link between vaccination and autism.

Common sense dictates that we not get hung up on the distinction between truth and what is useful; we can commit to a level just short of literal truth and accept the concept of approximation as weak but necessary for scientific claims. The position from which science moves forward is simply being the best at solving problems. Adequacy is fine for this; it is reliable and economical, like the neighborhood play at second base. Scientists can referee cognitive practices from this position and judge when invisible entities are acceptable, because they can observe when entities are used in, or for, good theories.

Section 5 – Conclusion

     Science is simply a belief, like religion. No one-size-fits-all regulations or broad views work for the man on the street; life is not a carrot-or-stick situation. Science remains the best alternative we have for knowledge and description of the world, and the social aspects of scientific practice and concrete evidence are both factors in determining preferences. If we do not take either one too far, technology will continue to pull science into balance, and we might find we have both the carrot and the stick.

Tension remains between followers of Darwinian doctrine and followers of religious doctrines because of differences on conceptual grounds. A young person may try to decide between Darwin and St. Peter, or between industrial progress and environmental protection. Are they to throw up their hands? No: they can understand reality, and judge whether theories are true and whether evidence is real, with help from empirically successful science and technology.


Begley, S. 2010. But It Works on TV! Forensic ‘science’ often isn’t. Newsweek: Science. Pg. 26.

Curd, M. & Cover, J. A. 1998. Philosophy of Science: The Central Issues. New York, Norton & Company.

The Week. (2011) Health scare of the week. News: Health & Science. The Week: The Best of the U.S. and International Media, pg 21.


January 8, 2012


The major cause of global warming is carbon dioxide, or CO2 for short (Santer, 1995). CO2 is a byproduct of the industrial revolution (Romm, 2010). Vehicles and coal-burning power plants produce the most CO2, 1.5 and 2.5 billion tons respectively (NRDC, 2011). That half of the nation’s electricity currently comes from coal, notably the dirtiest source of energy, is not controversial. While talk of “clean” coal processes circulates in the printed media, the truth is that the technology is not ready for prime-time adoption (U.S. News, 2010).

As a typical American, I do my part to contribute to global pollution by driving a gas-guzzling SUV and demanding plenty of electricity from my regional coal-fired power plant, but I wish greener choices for fuel and power were readily available to me. The solar industry is expanding and offers the promise that it can compete, somewhat, with fossil fuels in the power market. Costs are comparatively high, however, and present an immediate barrier to the average consumer compared with traditional petroleum-based energy, which requires no up-front investment.

Managers of the Department of Energy (DOE) Solar Energy Technology Program acknowledge that the costs of solar systems are still thirty to forty percent higher than traditional sources (U.S. News, 2010). The DOE is attempting to lower prices to encourage the adoption of renewable resources by 2015. The Obama Administration is working on several sustainable energy initiatives within the $80B stimulus plan for clean energy ventures (Mulrine, 2010), with $3.4 billion of funding slated for the development of the Smart Grid (Kingsbury, 2010).

In the article “A Solar Grand Plan,” published by Scientific American in 2008, the authors suggested solar power could halt greenhouse gas (GHG) emissions from fossil fuel and end American dependence on foreign petroleum by 2050. The plan would establish an extensive photovoltaic cell network in the Southwest that would collect solar energy during the day, store it underground, and provide power during the night. The proposal estimated that the project would deliver almost seventy percent of the electricity needs of the entire country, but would require $420 billion in government subsidies (Giberson, 2009). Giberson checked the data and estimated the potential:

“Three square miles yields a 280 MW capacity plant. Using the 70,000 homes number, a little calculation gives a 38 percent capacity factor for the plant, so that implies the plant would produce about 932,000 MWh per year. If all of the state’s electric power needs were generated using similar technologies and assuming constant economies of scale, it would take about 236 square miles (or about 1.2 percent of the land within the state) to accommodate the necessary solar power plants.”
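Giberson’s back-of-the-envelope arithmetic can be reproduced directly. A quick check (the 8,760 hours-per-year figure is standard; the capacity, capacity factor, and land figures come from the quote):

```python
# Verify the quoted solar-plant arithmetic.
capacity_mw = 280        # nameplate capacity of the 3-square-mile plant
capacity_factor = 0.38   # fraction of nameplate output achieved over a year
hours_per_year = 8760

annual_mwh = capacity_mw * hours_per_year * capacity_factor
print(f"{annual_mwh:,.0f} MWh per year")  # 932,064 MWh, i.e. "about 932,000"

# At ~3 square miles per plant, the quoted 236-square-mile statewide
# footprint corresponds to roughly 79 plants of this size.
plants = 236 / 3
print(round(plants))  # 79
```

The 280 MW × 8,760 h × 0.38 product lands almost exactly on the quoted 932,000 MWh figure, which is also how the “MWh” reading of the garbled unit in some reprints can be confirmed.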

Former Arizona governor Janet Napolitano seems to agree; she is quoted as saying: “There is no reason that Arizona should not be the Persian Gulf of solar energy.”

Some solutions to global warming involve indirectly reducing the demand for electricity and gasoline. Energy-efficient infrastructure materials address the reflectivity, conductivity, and emissivity impacts of building materials and pavements. High surface temperatures of concrete and asphalt are major contributors to the Urban Heat Island effect, a condition in which the excess heat retained by streets and buildings results in higher energy and water consumption.

By increasing the albedo, or reflectance coefficient, of materials like shingles and asphalt that absorb a great deal of radiation from the sun, engineers can lower the maximum daytime temperatures of habitats. By increasing the emissivity, or surface radiation coefficient, of materials like concrete, they can improve how heat transfers away at night, which in some ways is the more important factor for cooling off an entire city.

Since each additional degree of heat raises peak A/C demand by roughly three percent, reducing heat islands reduces energy consumption and minimizes the impact on the microclimate. “Both albedo and emissivity have positive responses in the reduction of pavement temperatures, both maximum and minimum even though their effects on each temperature are different” (Gui et al., 2007).
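The three-percent rule of thumb translates directly into avoided peak load. A minimal sketch for an entirely hypothetical city (the 1,000 MW base load and the degree figures are illustrative; only the 3%-per-degree rate comes from the text):

```python
def peak_ac_demand_mw(base_mw, extra_degrees):
    """Peak A/C demand after a heat island adds `extra_degrees` of heat,
    assuming roughly 3% more demand per additional degree."""
    return base_mw * (100 + 3 * extra_degrees) / 100

# Hypothetical city: 1,000 MW baseline peak A/C load, 5-degree heat island.
print(peak_ac_demand_mw(1000, 5))  # 1150.0 MW
# Cool-pavement measures that shave 2 degrees off the island avoid 60 MW of peak load:
print(peak_ac_demand_mw(1000, 5) - peak_ac_demand_mw(1000, 3))  # 60.0
```

Even a couple of degrees of heat-island mitigation, on this rule of thumb, avoids the output of a mid-sized peaking plant for a city of that scale.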

Even the production of traditional concrete creates GHG emissions, as well as pollution and particulates. New pervious concretes with superior thermophysical properties, and paints with superior temperature-gradient slopes, have been shown to mitigate the Urban Heat Island effect when used to cool roofs and pavements; they can also increase vehicle gas mileage.

As a back-up or emergency solution, some scientists, including members of the National Academy of Sciences (NPR, 2010), are promoting increased research into geoengineering. Geoengineering, which could more accurately be termed climate engineering, can be defined as manipulating the atmosphere to fix climate change after it has happened, rather than attempting to control GHG emissions before they are produced (Economist, 2010). Contributors to the Technology, Entertainment, Design (TED) organization are one group supporting geoengineering research aimed at slowing global warming, in case other attempts to control the atmosphere underperform.

Many people have concerns over practicality, side effects, and drawbacks. Three of the popular ideas being discussed for artificially re-adjusting nature’s equilibrium are placing giant mirrors in space to create a sunscreen; dumping iron into the oceans to increase phytoplankton; and dusting the stratosphere with particles or chemicals to block or reflect sunlight. In his book ‘Hack the Planet,’ author Eli Kintisch wonders how counterproductive blocking sunlight would be for solar energy systems: in essence, undermining the technology that repairs the cause of the problem, our dependence on fossil fuels.

Will geoengineering ideas work? Most people seem to think it is much too early to tell. Yet with every positive development in renewable energy production technologies, the global population comes closer to a sustainable existence on Earth. The shift away from unsustainable energy sources will generate financial pressures. Even if Western nations are able to reduce demand for petroleum, emerging markets in the Eastern Hemisphere may offset the worldwide balance by increasing demand for it. Money will need to be spent in the right places: renewable resources, high-tech materials, smarter buildings and neighborhoods. Cooperation with other nations, like the joint agreement with China to reduce GHG emissions (Kucera, 2010), is imperative; but the US government should be communicating a clearly thought-out master plan, be it a carbon tax, demand-side management, tree planting, or other not-yet-conceived solutions (Blackwell, 2011).

Works cited:

Kyoto Protocol – Should the United States Ratify the Kyoto Protocol? Retrieved 31 Mar 2006.

Blackwell, Richard. Temp green policies leave businesses in limbo. Thursday’s Globe and Mail. 23 Mar 2011. Retrieved from LinkedIn 23 Mar 2011.

Calhoun, Paul. Can the USA Be Fossil Fuel Independent by 2050? EzineArticles: News and Society: Environmental. 29 Feb 2008. Retrieved 22 Apr 2010.

Economist. Lift-off. Science & technology. 04 Nov. 2010. Retrieved 26 Mar 2011.

Giberson, Michael. The key to Arizona’s energy future. Knowledge Problem: Commentary on Economics, Information and Human Action. 12 Nov 2009. Retrieved 22 Apr 2010.

Gui, Jooseng, et al. Impact of Pavement Thermophysical Properties on Surface Temperatures. Journal of Materials in Civil Engineering: ASCE. August 2007. Pg. 689.

Kingsbury, Alex. A National Power Grid That Thinks. U.S. News & World Report magazine. April 2010. Pg. 37.

Kucera, Joshua. Side by Side in Need for Green Growth: China and America try cooperation. U.S. News & World Report magazine. April 2010. Pg. 42.

McKinsey. Greenhouse Gas Emissions Executive Summary pdf. 2007. Retrieved 22 Apr 2010.

Mulrine, Anna. Stuck in the Money Pipelines. U.S. News & World Report magazine. April 2010. Pg. 30.

Natural Resources Defense Council. Global Warming Basics. 2011. Retrieved 23 Mar 2011.

NPR Staff. Geoengineering: ‘A Bad Idea Whose Time Has Come.’ NPR News: Science. 29 May 2010. Retrieved 27 Mar 2011.

Romm, Joseph. Big Oil Keeps Blowing Smoke. U.S. News & World Report magazine. April 2010. Pg. 24.

Santer, Benjamin. Human Effects on Global Warming. US Dept of Energy Office of Science. 1995. Retrieved 23 Mar 2011.

U.S. News & World Report magazine. Progress Report: The Powers That Be. April 2010. Pg. 26.

Zweibel, Ken; Mason, James; and Fthenakis, Vasilis. A Solar Grand Plan. Scientific American. 16 Dec 2007. Retrieved 25 Apr 2010.