January 8, 2012


The major cause of global warming is carbon dioxide, or CO2 for short (Santer, 1995). CO2 is a byproduct of the industrial revolution (Romm, 2010). Vehicles and coal-burning power plants produce the most CO2, 1.5 and 2.5 billion tons respectively (NRDC, 2011). The fact that half of the nation’s electricity currently comes from coal, notably the dirtiest source of energy, is not controversial. While talk of “clean” coal processes circulates in the printed media, the truth is that the technology is not ready for prime-time adoption (U.S. News, 2010).

As a typical American, I do my part to contribute to global pollution by driving a gas-guzzling SUV and demanding plenty of electricity from my regional coal-fired power plant, but I wish that greener choices for fuel and power were readily available to me. The solar industry is expanding and offers the promise that it can compete, at least somewhat, with fossil fuels in the power market. Costs are comparatively high, however, and present an immediate barrier to the average consumer, in comparison to traditional petroleum-based energy that requires no up-front investment.

Managers of the Department of Energy (DOE) Solar Energy Technology Program acknowledge that the costs of solar systems are still thirty to forty percent higher than traditional sources (U.S. News, 2010). The DOE is attempting to lower prices to encourage adoption of renewable resources by 2015. The Obama Administration is working on several sustainable energy initiatives within the $80B stimulus plan for clean energy ventures (Mulrine, 2010), with $3.4 billion of funding slated toward the development of the Smart Grid (Kingsbury, 2010).

In the article “A Solar Grand Plan,” published by Scientific American magazine in 2008, the authors suggested solar power could halt greenhouse gas (GHG) emissions from fossil fuel and end American dependence on foreign petroleum by 2050. The plan would establish an extensive photovoltaic cell network in the Southwest which would collect solar energy during the day, store it underground, and provide power during the night. Their proposal estimated that the project would deliver almost seventy percent of the electricity needs for the entire country, but would require $420 billion in government subsidies to fund (Giberson, 2009). Giberson verified the data and estimated the potential:

“Three square miles yields a 280 MW capacity plant. Using the 70,000 homes number, a little calculation gives a 38 percent capacity factor for the plant, so that implies the plant would produce about 932,000 MWh per year. If all of the state’s electric power needs were generated using similar technologies and assuming constant economies of scale, it would take about 236 square miles (or about 1.2 percent of the land within the state) to accommodate the necessary solar power plants.”
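Giberson's arithmetic can be checked directly; a quick sketch (all figures taken from the quote above, rounded):

```python
# Check the arithmetic in Giberson's estimate (all figures from the quote).
HOURS_PER_YEAR = 8760

plant_mw = 280          # nameplate capacity of one plant
annual_mwh = 932_000    # stated annual output of that plant

capacity_factor = annual_mwh / (plant_mw * HOURS_PER_YEAR)
print(f"capacity factor: {capacity_factor:.0%}")          # ~38%, as quoted

# Land use: 3 sq mi per plant, 236 sq mi statewide
plants_needed = 236 / 3
statewide_mwh = plants_needed * annual_mwh
print(f"plants: {plants_needed:.0f}, statewide: {statewide_mwh / 1e6:.1f} TWh/yr")
```

The 38 percent capacity factor and the roughly 73 TWh of statewide annual output both fall out of the quoted numbers.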

Former Arizona governor Janet Napolitano seems to agree; she is quoted as saying: “There is no reason that Arizona should not be the Persian Gulf of solar energy.”

Some solutions to global warming involve indirectly reducing the demand for electricity and gasoline. Energy-efficient infrastructure materials reduce the reflectivity, conductivity and emissivity impacts of building materials and pavements. High surface temperatures of concrete and asphalt are major contributors to an effect known as the Urban Heat Island: a condition where the excess heat created by streets and buildings results in higher energy and water consumption.

By increasing the albedo, or reflectance coefficient, of materials like shingles and asphalt that absorb a great deal of radiation from the sun, engineers can affect maximum temperatures of habitats during the day. By increasing the emissivity, or surface radiation coefficient, of materials like concrete, they can affect how heat transfers at night, which in some ways is the more important factor for cooling off an entire city.

Since each additional degree of heat raises peak A/C demand by roughly three percent, reducing heat islands reduces energy consumption requirements and minimizes impact on the microclimate. “Both albedo and emissivity have positive responses in the reduction of pavement temperatures, both maximum and minimum even though their effects on each temperature are different” (Gui et al., 2007).
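To make that three-percent rule of thumb concrete, here is a minimal sketch; the city's base cooling load is an invented figure, and compounding per degree is my own reading of the rule:

```python
# Illustrative only: the rule of thumb above says each additional degree of
# heat raises peak A/C demand by roughly 3%. base_load_mw is a made-up figure.

def peak_demand(base_load_mw, extra_degrees, pct_per_degree=0.03):
    """Peak cooling demand after heat-island warming, compounded per degree."""
    return base_load_mw * (1 + pct_per_degree) ** extra_degrees

base = 1000  # MW, hypothetical citywide peak cooling load
for dT in (0, 2, 5):
    print(f"+{dT} deg: {peak_demand(base, dT):.0f} MW")
```

Five degrees of heat-island warming pushes a 1,000 MW peak to roughly 1,160 MW, which is why cool-pavement programs target exactly this margin.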

Even the production of traditional concrete creates GHG emissions, as well as pollution and particulates. New pervious concretes with superior thermophysical properties, and paints with superior temperature gradients, have been shown to mitigate the Urban Heat Island effect when used to cool roofs and pavements; they can also increase vehicle gas mileage.

As a back-up or emergency solution to the problem, some scientists, along with the National Academy of Sciences (NPR, 2010), are promoting increased research into geoengineering. Geoengineering, which could more accurately be termed climate engineering, can be defined as manipulating the atmosphere to fix climate change after it has happened rather than attempting to control GHG emissions prior to their production (Economist, 2010). Contributors to the Technology, Entertainment, Design (TED) organization are one group supporting geoengineering research for the purpose of slowing global warming, just in case other attempts to control the atmosphere underperform.

Many people have concerns over practicality, side effects and drawbacks. Three of the popular ideas being discussed for artificially re-adjusting nature’s equilibrium are using giant mirrors in space to create a sunscreen; dumping iron biomass into the oceans to increase phytoplankton; and dusting the stratosphere with some kind of particles or chemicals to block or reflect sunlight. In his book ‘Hack the Planet’, author Eli Kintisch wonders how counterproductive blocking sunlight would be for solar energy systems; in essence, undermining the very technology that addresses the root cause of the problem, our dependence on fossil fuels.

Will geoengineering ideas work? Most people seem to think it is much too early to tell. Yet with every positive development in renewable energy production technologies, the global population comes closer to a sustainable existence on Earth. The shift away from unsustainable energy sources will generate financial pressures. Even if western nations are able to reduce demand for petroleum, emerging markets in the eastern hemisphere may actually be offsetting the worldwide balance by increasing demand for it. Money will need to be spent in the right places: renewable resources, high-tech materials, smarter buildings and neighborhoods. Cooperation with other nations, like the joint agreement with China to reduce GHG emissions (Kucera, 2010), is imperative; but the US government should be communicating a clearly thought-out master plan, be it carbon tax, demand-side management, tree planting, or other not-yet-conceived solutions (Blackwell, 2011).

Works cited:

Kyoto Protocol – Should the United States Ratify the Kyoto Protocol? Retrieved 31 Mar 2006.

Blackwell, Richard. Temp green policies leave businesses in limbo. Thursday’s Globe and Mail. 23 Mar 2011. Retrieved from LinkedIn 23 Mar 2011.

Calhoun, Paul. Can the USA Be Fossil Fuel Independent by 2050? EzineArticles: News and Society: Environmental. 29 Feb 2008. Retrieved 22 Apr 2010.

Economist. Lift-off. Science & technology. 04 Nov. 2010. Retrieved 26 Mar 2011.

Giberson, Michael. The key to Arizona’s energy future. Knowledge Problem: Commentary on Economics, Information and Human Action. 12 Nov 2009. Retrieved 22 Apr 2010.

Gui, Jooseng; et al. Impact of Pavement Thermophysical Properties on Surface Temperatures. Journal of Materials in Civil Engineering: ASCE. August 2007. Pg. 689.

Kingsbury, Alex. A National Power Grid That Thinks. U.S. News & World Report magazine. April 2010. Pg. 37.

Kucera, Joshua. Side by Side in Need for Green Growth: China and America try cooperation. U.S. News & World Report magazine. April 2010. Pg. 42.

McKinsey. Greenhouse Gas Emissions Executive Summary pdf. 2007. Retrieved 22 Apr 2010.

Mulrine, Anna. Stuck in the Money Pipelines. U.S. News & World Report magazine. April 2010. Pg. 30.

Natural Resources Defense Council. Global Warming Basics. 2011. Retrieved 23 Mar 2011.

NPR Staff. Geoengineering: ‘A Bad Idea Whose Time Has Come.’ NPR News: Science. 29 May 2010. Retrieved 27 Mar 2011.

Romm, Joseph. Big Oil Keeps Blowing Smoke. U.S. News & World Report magazine. April 2010. Pg. 24.

Santer, Benjamin. Human Effects on Global Warming. US Dept of Energy Office of Science. 1995. Retrieved 23 Mar 2011.

U.S. News & World Report magazine. Progress Report: The Powers That Be. April 2010. Pg. 26.

Zweibel, Ken; Mason, James; and Fthenakis, Vasilis. A Solar Grand Plan. Scientific American. 16 Dec 2007. Retrieved 25 Apr 2010.

Communication Technology

January 5, 2012

Technology continues to develop from the needs of society, and it evolves into forms that influence society in ways never planned or imagined.

The roots of communication technology are based on the work of three men: a Scotsman, James Clerk Maxwell, who demonstrated that light was an electromagnetic wave and predicted on theoretical grounds that similar waves of different frequencies (either higher or lower than light) could be generated by electric discharges (sparks); a German physicist, Heinrich Hertz, who created an apparatus to generate and measure both high frequency waves (a few centimeters between each wave crest) and low frequency waves (a few meters in size) from a device called a spark gap transmitter; and an Italian, Guglielmo Marconi, who while studying electricity read a description of Hertz’s apparatus and noticed a feature of it that had apparently escaped Hertz’s attention.

Marconi succeeded in sending signals in Morse code as far as two miles. He also developed a simple antenna apparatus that would receive wave signals and convert them into direct current so that they could be heard by someone listening to the pattern of current through earphones (like telegraph signals). Marconi was not the only person experimenting with “wireless telegraphy” at the time, but he was the person who figured out a way to make money with it. He identified ship-to-ship and ship-to-shore communications as likely markets for his invention, reasoning that these markets could not possibly be served by the existing telegraph and telephone systems that depended on wires.

In 1901, Lee DeForest made a patent application for a detecting device he called a responder; then he invented and patented a triode, which he termed the audion. The audion was a modification of the diode, a device invented by John Ambrose Fleming. DeForest’s business and personal affairs were in a perilous state for several years and commanded most of his attention. In 1912, at the height of his financial problems, he sold the patent rights for the audion to AT&T.

Reginald Fessenden invented two important devices in 1901 and 1902. In 1901, he designed a new kind of receiver, the heterodyne receiver, which could convert high frequency waves produced by spark-gap transmitters into low frequency waves such as the kind that make diaphragms resonate in telephones. Fessenden took out a patent and went into business for himself (and several investors) as the National Electric Company. By 1902, he had designed a high-speed spark transmitter, called an alternator, which would produce sparks so fast that the waves it created were almost continuous. (Voice transmission requires continuous waves; Morse code transmission involves intermittent waves.) He contracted with the General Electric Company to build it. Using equipment that would otherwise be sending Morse code, Fessenden succeeded in sending the first “radio” messages.
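The heterodyne principle Fessenden pioneered falls out of a trigonometric identity: multiplying two sinusoids produces sum and difference frequencies, and the low difference frequency is what a telephone diaphragm can follow. A sketch with illustrative frequencies:

```python
import math

# Heterodyne principle: cos(a)*cos(b) = 0.5*cos(a-b) + 0.5*cos(a+b).
# Mixing an incoming wave with a local oscillation yields a low "beat"
# frequency (the difference) plus a high component that gets filtered out.
f_signal = 100_000   # Hz, incoming high-frequency wave (illustrative)
f_local = 99_000     # Hz, locally generated oscillation (illustrative)

f_diff = abs(f_signal - f_local)   # 1,000 Hz: audible, drives the diaphragm
f_sum = f_signal + f_local         # 199,000 Hz: filtered out

# Numerical check of the product-to-sum identity at an arbitrary instant
t = 1.23e-4
product = math.cos(2 * math.pi * f_signal * t) * math.cos(2 * math.pi * f_local * t)
identity = 0.5 * math.cos(2 * math.pi * f_diff * t) + 0.5 * math.cos(2 * math.pi * f_sum * t)
print(f_diff, abs(product - identity) < 1e-9)
```

The same mixing idea reappears in Armstrong's superheterodyne receiver later in this post.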

Edwin H. Armstrong made two substantial discoveries. He discovered that the audion (vacuum tube) could amplify sounds if the current coming off of the plate was fed back into the grid of the tube, and that under the right circumstances the same “regenerative circuit” could transform a tube into a transmitter. He ended up selling his rights to regenerative circuits to the Telefunken Company of Germany right as WWI was starting, because neither AT&T nor Marconi were interested. Imagine the political implications of this sale. The British were already nervous about the Kaiser’s investments and interest in electronics, so for an American to break ranks and do as he pleased in a free market economy must have been a bit of a political zinger. This action by Armstrong appears to be a brutal representation of capitalism trumping anything in its path, and I suspect the sale appeared to England to be a slight to U.S. neutrality, not just an ill-mannered move by a socially clueless, solitary American. Great Britain had been the premier monopolist of the nineteenth century and the West’s international banker, yet having a complementary relationship with the U.S. did not (could not) protect England from the activity of individual citizens, regardless of whether that activity would be damaging to a friendly country. Surely the English were not prepared for what happened, and were a little miffed at the impropriety. This kind of cross-pollination of business with Germany, including later with the Third Reich, was not restricted to Armstrong or Telefunken, however; it happened a number of times. As Jefferson said over 100 years earlier, merchants have no country.

It is difficult to measure to what extent the various parties were impacted by Armstrong’s sale. U.S. National Security Agency surveillance dates back to Signal Corps intercept tactics of 1914. As the war went on, Telefunken passed significant wartime traffic, and wireless technology played a central role in military communications. American espionage located Telefunken engineers by triangulation in Mexico City in 1917, operating against the U.S. The same year, under the Alien Property Custodian provisions, Telefunken’s American patent portfolio was seized and sold to GE for $1,500. Meanwhile Armstrong kept his foot on the gas; by the end of the war he had built an entire radio (the superheterodyne) within a single box, what an audiophile today would call a receiver, which could tune in a signal and amplify the sound. This he promptly sold to Westinghouse. Westinghouse later joined the RCA consortium that included GE and AT&T.
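Triangulation of the kind used to locate those Telefunken engineers reduces to intersecting two bearing lines taken from known listening posts. A minimal sketch with invented coordinates:

```python
import math

def triangulate(p1, bearing1_deg, p2, bearing2_deg):
    """Intersect two bearing lines (degrees clockwise from north) taken
    from known listening posts p1 and p2; returns the estimated source."""
    (x1, y1), (x2, y2) = p1, p2
    # Direction vectors: north is +y, bearings run clockwise from north.
    d1 = (math.sin(math.radians(bearing1_deg)), math.cos(math.radians(bearing1_deg)))
    d2 = (math.sin(math.radians(bearing2_deg)), math.cos(math.radians(bearing2_deg)))
    dx, dy = x2 - x1, y2 - y1
    det = d2[0] * d1[1] - d1[0] * d2[1]   # zero when bearings are parallel
    t = (d2[0] * dy - d2[1] * dx) / det   # distance along the first bearing
    return (x1 + t * d1[0], y1 + t * d1[1])

# Two posts 10 km apart; bearings of 45 and 315 degrees cross at a point
# 5 km east and 5 km north of the first post.
x, y = triangulate((0, 0), 45, (10, 0), 315)
print(round(x, 3), round(y, 3))   # 5.0 5.0
```

With more than two posts, the extra bearings over-determine the position and average out measurement error, which is why intercept networks used several stations.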

No one has succeeded for very long in controlling the American market for electronic devices, electronic components, or even the content of what is communicated by these devices. The pace of electronic change is very rapid, and partly because of the nature of electronics technology itself, partly because of the nature of capitalism, no aspect of electronic communications has been monopolized. Roughly speaking, there are three reasons:

a) Governmental regulations. Anti-trust, FCC…

b) Newer inventions or individual patents that supersede existing technology, coming from many devoted electronics amateurs as well as experts, make for a high volume of innovations.

c) The competitive possibilities that emerge from a free market economy, such as newer ideas for competition or market share. When new industrial and technical frontiers are opened, older companies suddenly discover that they cannot control areas that they once had expected to dominate.

For example, when post-war American producers were focusing on high-profit military markets, several Japanese manufacturers were able to capture the higher-volume consumer markets. By the 1980s, American firms had ceased to be in control of the market for electronic components, the very market they had originally created. In the case of videotape, U.S. companies tended to concentrate on quick profits, not on the long-term outlook, and they abandoned work on consumer video recording. Japanese companies including Toshiba spent the R&D money and ended up capturing the lucrative market for video recording.

Philosophy on creation-science

January 1, 2012

In McLean v. Arkansas Board of Education, 1981, McLean et al. successfully overturned Arkansas Act 590, which had required balanced treatment of creation-science whenever evolution-science curricula were presented in public schools. Dr. Michael Ruse was one of over a dozen witnesses for the plaintiff in this case, which may be unique for having experts qualify these two subjects in testimony.

Larry Laudan criticizes the demarcation criteria used in the ruling, though ultimately agreeing with the decision that found the 590 statute in violation of the Establishment Clause of the U.S. Constitution. Laudan felt that the decision would have been much stronger if it had been built on better arguments, and he shares them in his Commentary: Science at the Bar-Causes for Concern.

The demarcation criteria proposed in Judge Overton’s ruling focus on the point that creation-science is not science, and detail definitions of the important terms. Laudan writes that it is more accurate to define creation-science as science, and that it would have been a much stronger position to argue against creation-science on the grounds that it is bad science. Laudan’s ‘Concern’ is that neither Judge Overton nor Dr. Ruse has done a real service to the Evolution camp in its long-term rivalry against attempts to advance the religious conviction that a mystical spirit created humankind, because Overton adopts Ruse’s criteria of true science, which are either red herrings or embarrassing misrepresentations of what science really is. The criticisms of each criterion are as follows:

1- Creationism makes no (falsifiable) assertions. Laudan clarifies that, to the contrary, claims have been made, tested, and found to be false, and that the court was entirely silent on this, what would have been the best argument against creationism. Further, this framing places creationism outside the boundary of empirical confrontation. Laudan also notes that not all true science is testable in an isolated context.

2- Creationists are supposedly unscientific because they refuse to adjust their doctrine to advances in evidence that affect the doctrine. Laudan remarks that creationists would probably not admit that they change anything in light of evidence, but they have changed some positions over the years; in any case, there are examples of scientists in the past who committed dogmatically to their claims.

3- That if something cannot be explained by the natural laws we understand today, then it must be unscientific. Laudan states that Newtonian, Darwinian and plate-tectonic theories would all have qualified as unscientific by this standard.

These arguments leave multiple loopholes for creation-science: for one, there is no respectable consensus within the pro-science community on what exactly constitutes science; and for another, just because creationism is not science does not mean that it must be religion. It would be better, says Laudan, to simply compare the two doctrines head to head and observe that one is entirely robust while the other is not. It is also important to maintain the integrity of the intellectual community, which this ruling does not do.

In a response to criticism, Ruse does not adequately address Laudan’s philosophical concerns, but he does mention some noteworthy points. For one thing, many of the believers in miracles are not going to “be swayed by any empirical facts.” Ruse also argues Laudan’s arguments would not have stood up in court because it is not illegal to teach bad science, while in fact, the McLean decision did stand up in the U.S. Supreme Court six years later in Edwards v. Aguillard, 1987.

“Some creationists responded to [the Edwards v. Aguillard] decision by refashioning “creation science” to avoid any explicit references to the Bible, to God, or to the beliefs of a particular religious sect. This version of creationism re-emerged as part of the “intelligent design” movement in the 1990s.”

15 Business Tech Tools for 2012

December 27, 2011

In 2011, Computer Economics, Inc. published a report on trends in technology that profiled organizations’ IT solutions as investment strategies. The following is a statistical review of the major characteristics, grouped into three categories: A) those experiencing the most investment activity, B) those with the most interesting results, and C) those that are almost compulsory for doing business.

A) Group that is experiencing the most investment activity:

#1: ERP: The rate of investment in Enterprise Resource Planning pushes it to the top of this list of 15 technologies that businesses invest in, even though it has the poorest risk-to-reward ratio. ERP strategies reach positive ROI and break even (BE) for about half of the companies that adopt them, but total cost of ownership (TCO) frequently exceeds original budget estimates. This is a very mature business technology that remains a mandatory tool for large enterprises, but one difficult to forecast as an expense. In comparison to other strategies, it must be considered high risk, and rewards in terms of ROI and BE are only classified as moderate.

#2: CRM: Customer Relationship Management strategies are currently experiencing high rates of investment. CRM has ROI and BE numbers similar to ERP, but CRM hits better TCO points because the actual costs of adoption meet original budget estimates for approximately 70% of the companies that invest. CRM can be classified as having moderate risk with moderate rewards.

#3: BI: Business Intelligence systems are experiencing very high rates of investment. BI systems have several capabilities but commonly use analysis tools to query internal databases and develop predictions for competitive decisions. BI has equivalent TCO numbers to CRM, with slightly better BE points than most other technologies. BI can be classified as having only moderate risk with high rewards.

#4: Enterprise Collaboration: Identifying financial rewards in collaboration systems is a difficult proposition, but this has not slowed the rate of investment in these technologies. Enterprise Collaboration systems meet what could be referred to as the TCO standard for business technology, where actual costs are consistent with original budget estimates in 70% of cases. BE points are good, but ROI for this technology is lower, with only a third of businesses getting the expected returns. Enterprise Collaboration systems should be classified as moderate in risk and moderate in financial reward.

#5: Mobile Applications: Less than half of businesses have adopted Mobile Apps, but Mobile Apps are one of only two technologies whose pace of investment percentage exceeds their adoption percentage, signifying a very fast growth rate. Mobile Apps are positioned right in the center of the graph for risk and reward; a true bull’s-eye of moderation on both axes. This is not to be confused with average; the average position on the scatter chart for the whole group of technologies is closer to where the border between low and moderate risk intersects the border between moderate and high reward, so Mobile Apps are not quite as safe as the average strategy on this list.

B) Group with most interesting results:

#6: Unified Communications: UC can deliver any type of communication via real-time methods (e.g., chat, whiteboarding, voice, forwarding, video) by combining a whole set of technologies into a consistent interface. UC’s value comes from its ability to integrate real-time communications with delayed-delivery communications, but it reaches the enterprise bottom line by integrating communications into the business process cluster. UC has great ROI numbers, with two thirds of companies that adopt it experiencing positive returns, putting it well into the high reward classification. Meanwhile risk is a little better than average, at low to moderate. Ultimately, and perhaps obviously, a typical UC solution from a provider such as Sprint, for example, is more expensive than a traditional on-premise PBX system and voice package. But when the cost is weighed against significant gains in productivity, the scale tips toward adoption of these steadily improving technologies, which explains why the market is expanding and predicted (ABI Research) to “reach 2.3 billion by 2016.”

#7: Desktop Virtualization: Not to be confused with server virtualization (v12n), Desktop Virtualization is the arrangement where your hardware is on your desk, but most of “your” software is accessed over a network, or online. Desktop Virtualization systems are almost guaranteed to come in under budget, thus providing a great reward ratio. In most cases this type of operation also improves security, which reduces risk, giving an indirect bonus to the reward ratio as well.

#8: SaaS: Software as a Service has the best low-risk, high-reward financial profile of all the technologies on this list. Its costs are very predictable, with 80% of businesses reporting that TCO met original budget estimates, and nine out of ten businesses hitting BE or seeing positive ROI within two years.

#9: PaaS: Platform as a Service is a less mature technology than IaaS, or any other technology on this list. Almost no companies have implemented this true cloud environment, and few are considering doing so. However, risk of exceeding TCO estimates with PaaS is only moderate while rewards so far have been high.

#10: IaaS: Infrastructure as a Service is the purest form of the Cloud trinity. Companies are experiencing slower-than-average BE times, but predicting the TCO is easy, and since ROI is acceptable, IaaS can be classified as low risk with high reward.

#11: SCM: The percentage of companies adopting Supply Chain Management is lower than I had expected. I forget that many industries have no use for either the planning or execution systems available within SCM. However, for the companies that have adopted it, which is about a third of the business economy, it has been a highly rewarding strategy because ROI for SCM is excellent, while risk is moderate. In the future, Cloud technologies should help with some of the challenges businesses currently face with implementing SCM strategies.

#12: Tablets: As a technology, tablets are economical but as a business strategy they are the second most expensive in terms of exceeding estimates for cost of ownership. Meanwhile reward has been measured as a flat line, not even getting off the floor.

#13: Legacy System Renewal: Not all companies have legacy systems, but as time marches on, the legacy renewal decision catches up to everyone. The question of whether to fix up existing equipment or to buy new can be a tough one to answer accurately. Upgrading to new equipment may seem like a no brainer, but it can be a gamble, take Microsoft Vista for example. Legacy system renewal is ultimately moderately risky, and moderately rewarding.

C) Group that is almost compulsory for doing business.

#14: Windows 7: Over three fourths of companies have already adopted or plan to adopt Windows 7. It is one of those technologies that are almost a mandatory cost of doing business. Whether for a large enterprise or a mom-and-pop shop, Windows 7 gives at least moderate rewards with low risk.

#15: HRMS: Human Resource Management Systems are a very mature technology. HRMS has been around for a while, and three quarters of businesses use some form of software to control employee information; obviously, for large labor forces it is basically a necessity. Risk is at the low end of the moderate range and reward is at the high end of the moderate range.
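The informal risk/reward bucketing used throughout this list can be sketched as a small function; the thresholds here are my own illustrative choices, not taken from the Computer Economics report:

```python
# Hypothetical sketch of the risk/reward bucketing used informally above:
# risk from how often actual TCO met budget, reward from the ROI success rate.
# Threshold values are invented for illustration.

def classify(tco_within_budget_pct, positive_roi_pct):
    risk = ("low" if tco_within_budget_pct >= 75
            else "moderate" if tco_within_budget_pct >= 60
            else "high")
    reward = ("high" if positive_roi_pct >= 60
              else "moderate" if positive_roi_pct >= 40
              else "low")
    return risk, reward

# SaaS figures from the text: 80% met budget, ~90% positive ROI / break-even
print(classify(80, 90))   # ('low', 'high'), matching its description above
```

Plotting every technology's pair of percentages on these two axes reproduces the scatter chart the Mobile Apps entry describes.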

Internet Protocol Multimedia Subsystem

December 7, 2011

Foster, W.; Reinsch, R. Huawei’s leadership role in IMS standards development and in its own proprietary Softswitch. Journal of Chinese Management Studies, Emerald Group Publishing. 2010.

Full text of the article can be requested online.

Pros and Cons of the Internet

December 5, 2011


Pros of the internet are fairly well known and widely reported. Good information and quality content are in abundance on the web; all types of communication have been expanded and in many ways improved. The internet helps society by allowing us to order shoes that are not available in stores and have them shipped directly to any location faster than we could have even driven to the store, and to instantly send words of encouragement to our friends and family. This helps everyone, but especially physically handicapped people and people with transportation challenges, and it saves people time and money.

The communications effects are similar to what happened when the printing press started cranking out pamphlets over five hundred years ago. Information that previously would not have reached end users became available to them; it traveled greater distances, disseminated more widely, and moved at a faster rate. Nowadays something that would probably never have received any exposure can “go viral” so fast that it becomes a problem for the concerned parties, because they do not have time to prepare for the exposure. One interesting example of how even minor news items are broadcast is the USS George H.W. Bush aircraft carrier story.

The effects of new things are often a double-edged sword. Computers transformed the whole jobs landscape in terms of what jobs were available, and the internet has transformed the jobs landscape in terms of where jobs are available. Demand for something must fall somewhere as a result of this shift in locations. The winners were people who were able to take advantage of the shift, and wherever demand fell, the losers (like the United States Postal Service, for instance) lost money, or prestige, or both. Today the USPS declared its surrender-to-decline by slowing the delivery of First Class mail, lowering the standard that it set forty years ago.

As for data, bad and “illegal” types of content are also in abundance on the web, not just good stuff. Artists suffer losses from copyright infringement; advertising has another outlet; at the extreme end of the scale we find content that clearly hurts society, such as increased propaganda and purposeful misinformation. People tend to give credibility to almost all forms of printed information, but bad content on the internet can serve to expedite narrow-mindedness. If a person simply goes out looking for what they want with a predetermined conclusion in mind, they can usually find it without having to look too hard. People can quickly locate sources for “facts” to support their opinion or position on any particular subject.

One particular con of the internet currently receiving much attention is the way that personal privacy can be invaded. In fact, people may not actually have much in the way of real privacy these days, not necessarily because of the internet, but the web does enhance your exposure if you use it without thinking about privacy. Earlier this week, when I was in the middle of writing this piece, I read (on the internet) that Facebook was compelled by the Federal Trade Commission to do something about its privacy settings. That story totally supported the claim I was already making, and I didn’t even have to go looking for it; it came to me automatically from a feed aggregator that I use.

In a few days my weekly news magazine will come to my mailbox via the USPS truck. I expect I will open the magazine and find a paper version of an article on that Facebook story. As far as I know, the USPS does not track what they put in my mailbox, so I guess reading my own paper magazine in the privacy of my own home is still an anonymous affair. Please do not suggest to the government that the USPS could track what I read and sell that information to advertisers, to regain some of the losses from the shift in demand away from their service.

Link to ‘EV Equipped’ Eco House | Japan for Sustainability

August 9, 2011

Japanese Housing Maker Launches ‘EV Equipped’ Eco House | Japan for Sustainability.

Link to ‘What environmentalists need to know about economics’

May 16, 2011

Environmental Economics: What environmentalists need to know about economics.

SWOT: What is the result?

September 16, 2010

Company stakeholders frequently need to make business decisions on whether to make supply chain adjustments, upgrade equipment, provide employee benefits, or handle a myriad of other situations, including dealing with declining sales. These company needs often fall somewhere along a spectrum with two poles: at one end, the change may be mandated, possibly via regulatory authority; at the other end are changes described as ‘nice to have’. Since most of these changes have some ratio of benefit to risk, it behooves the organization to estimate that ratio as well as possible and perform due diligence, rather than just guess whether or not the results of a change will be good for business. Risks are often underlying or hidden at first glance; to a small business, errors in planning and judgment can be devastating.

Surprisingly, senior managers often fail to understand the extent of planning that is needed in order to fit a change into the overall strategy of the organization. They also fail to understand the difficulty in lining up the ideals of a change to meld with the enterprise’s big picture. If a project has no real downside in practice then there is no problem, but the majority of undertakings do cost some quantity of resources, and those resources [cash, for example] can be limited. What if there are a dozen potential projects and only enough cash for one? In the same way that a single person decides which tasks they can accomplish during their lunch hour, an organization must identify the details of various objectives before it can select which ones to invest in. Enter the need for Strategic Planning Analysis.

By analyzing the advantages and disadvantages inherent to the company itself, and by studying the opportunities and hazards involved with a prospective project, stakeholders or managers can best protect the long term objectives of their companies. Many executives are tempted to shortcut this process, but the results can be very expensive and end up thwarting progress instead of advancing it. Simply purchasing a brand name ERP package and outsourcing the implementation to a local expert is one such shortcut (Schwalbe, 2006).

Successful companies use a structured process for analysis. Sony, Procter & Gamble, 3M, DHL, Pfizer, Boeing, DuPont, MetLife, to name a few, consistently use a calculating system to define the results that they want to measure or gauge. It includes the benefits that they expect, articulations of their own ideas, key performance indicators, and controls for timelines. One system used for project selection is the SWOT analysis: Strengths, Weaknesses, Opportunities and Threats. Many criteria are quantified in order to predict how well a development can help achieve the goals of the company, including whether the people in the organization will back the project and whether the employees will be motivated to work together to carry out the tasks that will sustain the mission. Other metrics involve financial performance, such as whether the organization has enough capital, the opportunity cost of that capital, return on investment, etc. SWOT provides a coordinated method to manage the complexities of all the different options in project selection.
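Quantified project selection of the kind described above often boils down to weighted scoring. The criteria, weights, and candidate projects in this sketch are invented for illustration, not a standard SWOT template:

```python
# Illustrative weighted-scoring sketch for project selection; every name,
# weight, and score below is made up for demonstration purposes.

criteria = {           # weight = relative importance (sums to 1.0)
    "supports strategy": 0.30,
    "financial return":  0.30,
    "employee backing":  0.20,
    "available capital": 0.20,
}

projects = {           # score each criterion 0-10 per candidate project
    "ERP upgrade": {"supports strategy": 8, "financial return": 5,
                    "employee backing": 4, "available capital": 6},
    "CRM rollout": {"supports strategy": 7, "financial return": 7,
                    "employee backing": 8, "available capital": 7},
}

def weighted_score(scores):
    """Sum of criterion scores, each weighted by its importance."""
    return sum(criteria[c] * scores[c] for c in criteria)

for name, scores in projects.items():
    print(f"{name}: {weighted_score(scores):.1f}")
```

With only enough cash for one project, the higher composite score wins; the strengths/weaknesses and opportunities/threats findings feed the scores themselves.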

Link to forecast table including Symbian, Android, Rim, iOS and Windows

September 10, 2010