Approaches to project management have focused on the systems, procedures, and software put in place to determine progress and likely outcomes. These outcomes are usually expressed in terms of cost, schedule, and technical achievement against the project requirements and framing assumptions—the oft-cited three-legged stool of project management. What is often missing are measures related to human behavior within the project systems environment. In this article at AITS.org, I explore this oft-ignored dimension.
It’s spring training time in sunny Florida, as well as other areas of the country with mild weather and baseball. For those of you new to the allusion, it comes from a poem by Franklin Pierce Adams and is also known as “Baseball’s Sad Lexicon”. Tinker, Evers, and Chance were the double-play combination of the 1910 Chicago Cubs (shortstop, second base, and first base). Because of their effectiveness on the field, these Cubs players were worthy opponents of the old New York Giants, of whom Adams was a fan, and who were the kings of baseball during most of the first fifth of a century of the modern era (1901-1922). That is, until they were suddenly overtaken by their crosstown rivals, the Yankees, who came to dominate baseball for the next 40 years, beginning with the arrival of Babe Ruth.
The analogy here is that the Cubs infielders, while individuals, didn’t think of their roles as completely separate. They had common goals and, in order to win on the field, needed to act as a unit. In the case of executing the double play, they were a very effective unit. So why do we have these dichotomies in information management when the goals are the same?
Much has been written both academically and commercially about Business Intelligence, Business Analytics, and Knowledge Discovery in Databases. I’ve surveyed the literature, for good and bad, and what I find is that these terms are thrown around, mostly by commercial firms in either information technology or consulting, all with the purpose of attempting to provide a discriminator for their technology or service. Many times the concepts are used interchangeably, or one is set up as a strawman to push an agenda or product. Thus, it seems some hard definitions are in order.
According to Techopedia:
Business Intelligence (BI) is the use of computing technologies for the identification, discovery and analysis of business data – like sales revenue, products, costs and incomes.
Business analytics (BA) refers to all the methods and techniques that are used by an organization to measure performance. Business analytics are made up of statistical methods that can be applied to a specific project, process or product. Business analytics can also be used to evaluate an entire company.
Knowledge Discovery in Databases (KDD) is the process of discovering useful knowledge from a collection of data. This widely used data mining technique is a process that includes data preparation and selection, data cleansing, incorporating prior knowledge on data sets and interpreting accurate solutions from the observed results.
As with much of computing in its first phases, these functions were seen to be separate.
The perception of BI, based largely on the manner in which it has been implemented in its first incarnations, is viewed as a means of gathering data into relational data warehouses or data marts and then building out decision support systems. These methods have usually involved a great deal of overhead in both computing and personnel, since practical elements of gathering, sorting, and delivering data involved additional coding and highly structured user interfaces. The advantage of BI is its emphasis on integration. The disadvantage, from the enterprise perspective, is that the method and mode of implementation is phlegmatic at best.
BA is BI’s younger cousin. Applications were developed and sold as “analytical tools” focused on a niche of data within the enterprise’s requirements. In this manner decision makers could avoid having to wait for the overarching and ponderous BI system to get to their needs, if ever. This led many companies to knit together specialized tools in so-called “best-of-breed” configurations to achieve some measure of integration across domains. Of course, given the plethora of innovative tools, much data import and reconciliation has had to be inserted into the process. Thus, the advantages of BA in the market have been to reward innovation and focus on the needs of the domain subject matter expert (SME). The disadvantages are the insertion of manual intervention in an automated process due to lack of integration, which is further exacerbated by so-called SMEs in data reconciliation–a form of rent-seeking behavior that only rewards body shop consulting, unnecessarily driving up overhead. The panacea applied to this last disadvantage has been the adoption of non-proprietary XML schemas across entire industries that reduce both the overhead and data silos found in the BA market.
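To make that last point concrete, here is a minimal sketch, with entirely hypothetical element names, of why a shared, non-proprietary schema reduces reconciliation overhead: once every tool exports the same agreed structure, a single parser serves all of them and the manual import-and-reconcile step disappears.

```python
# A minimal sketch (hypothetical element names, not any industry's actual
# schema) of the payoff of a shared XML export structure: one parser
# serves every tool that conforms to it.
import xml.etree.ElementTree as ET

COMMON_EXPORT = """\
<costReport>
  <task id="1.1"><bcws>100.0</bcws><bcwp>90.0</bcwp><acwp>95.0</acwp></task>
  <task id="1.2"><bcws>250.0</bcws><bcwp>240.0</bcwp><acwp>260.0</acwp></task>
</costReport>
"""

def load_tasks(xml_text: str) -> list[dict]:
    """Parse any tool's export that conforms to the common schema."""
    root = ET.fromstring(xml_text)
    return [
        {
            "id": task.get("id"),
            "bcws": float(task.findtext("bcws")),
            "bcwp": float(task.findtext("bcwp")),
            "acwp": float(task.findtext("acwp")),
        }
        for task in root.iter("task")
    ]

for t in load_tasks(COMMON_EXPORT):
    # Cost performance index (CPI = BCWP / ACWP), a standard EVM ratio.
    print(t["id"], "CPI =", round(t["bcwp"] / t["acwp"], 3))
```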
KDD is both our oldster and our youngster–grandpa and the grandson hanging out. It is a term that describes a necessary function of insight–allowing one to determine what the data tells us is needed for analytics rather than relying on a “canned” solution to determine how to approach a particular set of data. But it does so, oftentimes, using an older approach that predates BI, known as data mining. You will often find KDD linked to arguments in favor of flat file schemas, NoSQL (meaning flat non-relational databases), and free use of the term Big Data, which is becoming more meaningless each year that it is used, given Moore’s Law. The advantage of KDD is that it allows for surveying across datasets to pick up patterns and interrelationships within our systems that are otherwise unknown, particularly given the way in which the human mind can fool itself into reifying an invalid assumption. The disadvantage, of course, is that KDD would have us go backward in terms of identifying and categorizing data by employing Data Mining, which is an older concept from early in computing in which a team of data scientists and data managers develop solutions to identify, categorize, and use that data–manually doing what automation was designed to do. Understanding these limitations, companies focused on KDD have developed heuristics (cognitive computing) that identify patterns and possible linkages, removing a portion of the overhead associated with Data Mining.
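As a hedged illustration of that surveying function, here is a minimal sketch of KDD-style pattern discovery: scan every pair of measures for strong correlations that no one hypothesized in advance, and flag candidates for a human analyst to vet. The dataset and the 0.8 threshold are invented for illustration, not a canned solution.

```python
# A minimal sketch of KDD-style surveying across measures: flag unexpected
# correlations rather than starting from a prior hypothesis. Data and the
# 0.8 cutoff are illustrative assumptions. Requires Python 3.10+.
from itertools import combinations
from statistics import correlation

measures = {
    "staff_turnover":    [2, 3, 5, 4, 6, 8],
    "defect_rate":       [1.1, 1.4, 2.2, 1.9, 2.8, 3.5],
    "schedule_slip_pct": [0.5, 0.4, 0.6, 0.5, 0.7, 0.6],
}

for (a, xs), (b, ys) in combinations(measures.items(), 2):
    r = correlation(xs, ys)  # Pearson's r
    if abs(r) > 0.8:  # candidate pattern for a human analyst to vet
        print(f"candidate relationship: {a} ~ {b} (r = {r:.2f})")
```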
Keep in mind that you never get anything for nothing–the Second Law of Thermodynamics ensures that energy must be borrowed from somewhere in order to produce something–and its corollaries place limits on expected efficiencies. While computing itself comes as close to providing us with Maxwell’s Demon as any technology, even in this case entropy is being realized elsewhere (in the software developer and the hardware manufacturing process), even though it is not fully apparent in the observed data processing.
Thus, manual effort must be expended somewhere along the way. In any case, all of these methods are addressing the same problem–the conversion of data into information. It is information that people can consume, understand, place into context, and act upon.
As my colleague Dave Gordon has pointed out to me several times, there are additional methods that have been developed across all of these approaches to make our use of data more effective. These include more powerful APIs, the aforementioned cognitive computing, and searching based on the anticipated questions of the user, as is done by search engines.
Technology, however, is moving very rapidly, and so the lines between BI, BA and KDD are becoming blurred. Fourth generation technology that leverages API libraries to be agnostic to underlying data, and flexible and adaptive UI technology, can provide a comprehensive systemic solution to bring together the goals of these approaches to data. With the ability to leverage internal relational database tools and flat schemas for non-relational databases, the application layer, which is oftentimes a barrier to delivery of information, becomes open as well, putting the SME back in the driver’s seat. Being able to integrate data across domain silos provides insight into systems behavior and performance not previously available with “canned” applications written to handle and display data a particular way, opening up knowledge discovery in the data.
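A minimal sketch of what “agnostic to underlying data” can look like in practice follows, with all table and field names hypothetical: the consuming code works against one small interface whether the store underneath is relational or a flat document collection.

```python
# A minimal sketch of a data-agnostic access layer: one interface, two very
# different stores (relational SQLite vs. flat JSON documents). All names
# are illustrative assumptions.
import json
import sqlite3
from typing import Iterable, Protocol

class TaskSource(Protocol):
    def tasks(self) -> Iterable[dict]: ...

class RelationalSource:
    def __init__(self, conn: sqlite3.Connection):
        self.conn = conn
    def tasks(self) -> Iterable[dict]:
        cur = self.conn.execute("SELECT id, bcwp, acwp FROM tasks")
        for id_, bcwp, acwp in cur:
            yield {"id": id_, "bcwp": bcwp, "acwp": acwp}

class DocumentSource:
    def __init__(self, json_text: str):
        self.docs = json.loads(json_text)
    def tasks(self) -> Iterable[dict]:
        yield from self.docs

def report(source: TaskSource) -> None:
    # The application layer no longer cares which store sits underneath.
    for t in source.tasks():
        print(t["id"], "CPI =", round(t["bcwp"] / t["acwp"], 3))

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE tasks (id TEXT, bcwp REAL, acwp REAL)")
    conn.execute("INSERT INTO tasks VALUES ('1.1', 90.0, 95.0)")
    report(RelationalSource(conn))
    report(DocumentSource('[{"id": "1.2", "bcwp": 240.0, "acwp": 260.0}]'))
```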
What this means practically is that those organizations that are sensitive to these changes will understand the practical application of sunk cost when it comes to aging systems being provided by ponderous behemoths that lack agility in their ability to introduce more flexible, less costly, and lower overhead software technologies. It means that information management can be democratized within the organization among the essential consumers and decision makers.
Productivity and effectiveness are the goals.
The National Defense Industrial Association’s Integrated Program Management Division (NDIA IPMD) just had its quarterly meeting here in sunny Orlando where we braved the depths of sub-60 degrees F temperatures to start out each day.
For those not in the know, these meetings are an essential coming together of policy makers, subject matter experts, and private industry practitioners regarding the practical and mundane state-of-the-practice in complex project management, particularly focused on the concerns of the federal government and the Department of Defense. The end result of these meetings is to publish white papers and recommendations regarding practice to support continuous process improvement and the practical application of project management practices–allowing for a cross-pollination of commercial and government lessons learned. This is also the intersection where innovation among the large and small is given an equal vetting and an opportunity to introduce new concepts and solutions. This is an idealized description, of course, and most of the petty personality conflicts, competition, and self-interest that plagues any group of individuals coming together under a common set of interests also plays out here. But the days are long, and the workshops generally produce good products that become the de facto standard of practice in the industry. Furthermore, the control that keeps the more ruthless personalities in check is the fact that, while it is a large market, the complex project management community tends to be a relatively small one, which reinforces professionalism.
The “blues” in this case is not so much borne of frustration or disappointment but, instead, from the long and intense days that the sessions offer. The biggest news from an IT project management and application perspective was twofold. The data stream used by the industry in sharing data in an open systems manner will be simplified. The other was the announcement that the technology used to communicate will move from XML to JSON.
From human-readable formatting to data-focused formatting. Under Kendall’s Better Buying Power 3.0, the goal of the Department of Defense (DoD) has been to incorporate better practices from private industry where they can be applied. I don’t see initiatives for greater efficiency and reduction of duplication going away in the new Administration, regardless of what a new initiative is called.
In case this is news to you, the federal government buys a lot of materials and end items–billions of dollars worth. Accountability must be put in place to ensure that the money is properly spent to acquire the things being purchased. Where technology is pushed and where there are no commercial equivalents that can be bought off the shelf, as in the systems purchased by the Department of Defense, there are measures of progress and performance (given that the contract is under a specification) that are submitted to the oversight agency in DoD. This is a lot of data, and to be brutally frank, the method and format of delivery has been somewhat chaotic, inefficient, and duplicative. The Department moved to address this with a somewhat modest requirement: open systems submission of an application-neutral XML file under the standards established by the UN/CEFACT XML organization. This was called the Integrated Program Management Report (IPMR). This move garnered some improvement where it has been applied, but contracts are long-term, so incorporating improvements through new contractual requirements tends to take time. Plus, there is always resistance to change. The Department is moving to accelerate addressing these inefficiencies in their data streams by eliminating the unnecessary overhead associated with specifications of formatting data for paper forms and dealing with data as, well, data. Great idea and bravo! The rub here is that in making the change, the Department has proposed dropping XML as the technology used to transfer data and moving to JSON.
XML to JSON. Before I spark another techie argument about the relative merits of each, there are some basics to understand here. First, XML is a language, while JSON is simply a data exchange format. This means that XML is specifically designed to deal with hierarchical and structured data that can be queried, and where validation and fidelity checks within the data are inherent in the technology. XML is also known to scale while maintaining the integrity of the data, which is intended for use in relational databases. Finally, XML is hard to break: it is meant for editing and will maintain its structure and integrity afterward.
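For illustration, here is a minimal sketch of the validation XML brings natively through an XSD schema: out-of-spec data is rejected before it ever enters the pipeline. This assumes the third-party lxml library; the schema and records are invented for the example.

```python
# A minimal sketch of XML's native validation via XSD: malformed or
# out-of-spec records are rejected up front. Assumes the third-party
# lxml package; schema and records are illustrative.
from lxml import etree

XSD = """<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="task">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="acwp" type="xs:decimal"/>
      </xs:sequence>
      <xs:attribute name="id" type="xs:string" use="required"/>
    </xs:complexType>
  </xs:element>
</xs:schema>"""

schema = etree.XMLSchema(etree.fromstring(XSD.encode()))

good = etree.fromstring(b'<task id="1.1"><acwp>95.0</acwp></task>')
bad  = etree.fromstring(b'<task><acwp>not a number</acwp></task>')

print(schema.validate(good))  # True  -- conforms to the schema
print(schema.validate(bad))   # False -- missing id, non-decimal acwp
```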
The counter argument encountered is that JSON is new! and uses fewer characters! (which usually turns out to be inconsequential), and people are talking about it for Big Data and NoSQL! (but this happened after the fact and the reason for shoehorning it this way is discussed below).
So does it matter? Yes and no. As a supplier specializing in delivering solutions that normalize and rationalize data across proprietary file structures and leverage database capabilities, I don’t care. I can adapt quickly and will have a proof-of-concept solution out within 30 days of receiving the schema.
To address JSON deficiencies relative to XML, a number of tools have been and are being developed to replicate the fidelity and reliability found in XML. Whether this is sufficient to be effective against a structured LANGUAGE remains to be seen. Much of the overhead that techies complain about in XML is due to the native functionality related to the power it brings to the table. No doubt, a bicycle is simpler than a Formula One racer–and this is an apt comparison. Claiming “simpler” doesn’t pass the “So What?” test knowing the business processes involved. The technology needs to be fit to the solution. The purpose of data transmission using APIs is not only to make it easy to produce but for it to–you know–achieve the goals of normalization and rationalization so that it can be used on the receiving end, which is where the consumer (which we usually consider to be the customer) sits.
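One such tool is JSON Schema. A minimal sketch, assuming the third-party jsonschema package and invented field names, shows the kind of fidelity check being bolted onto JSON after the fact:

```python
# A minimal sketch of bolting XML-like fidelity onto JSON with JSON Schema.
# Assumes the third-party jsonschema package; field names are illustrative.
from jsonschema import validate, ValidationError

TASK_SCHEMA = {
    "type": "object",
    "required": ["id", "acwp"],
    "properties": {
        "id":   {"type": "string"},
        "acwp": {"type": "number"},
    },
}

for record in ({"id": "1.1", "acwp": 95.0},
               {"acwp": "not a number"}):
    try:
        validate(instance=record, schema=TASK_SCHEMA)
        print(record["id"], "accepted")
    except ValidationError as err:
        print("rejected:", err.message)
```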
At the end of the day, the ability to scale and handle hierarchical, structured data will rely on the quality and strength of the schema and the tools that are published to enforce its fidelity and compliance. Otherwise consuming organizations will be receiving a dozen different proprietary JSON files, which does not address the present chaos but simply adds to it. These issues were aired out during the meeting, and it seems that everyone is aware of the risks and that they can be addressed. Furthermore, as the schema is socialized across solutions providers, it will be apparent early whether the technology will be able to handle the project performance data resulting from the development of a high performance aircraft or a U.S. Navy destroyer.
Atif Qureshi at Tasque, whom I learned about via Dave Gordon’s blog, went out to LinkedIn’s Project Management Community to ask for the latest trends in project management. You can find the raw responses to his inquiry at his blog here. What is interesting is that some of these latest trends are much like the old trends, which, given continuity, makes sense. But it is instructive to summarize the ones that came up most often. Note that while Mr. Qureshi was looking for ten trends, and taken together he definitely lists more than ten, there is a lot of overlap. In total, the major issues seem to fall into the five areas listed below.
a. Agile, its hybrids, and its practical application.
It should not surprise anyone that the latest buzzword is Agile. But what exactly is it in its present incarnation? There is a great deal of rising criticism, much of it valid, that it is a way for developers and software PMs to avoid accountability. Anyone reading Glen Alleman’s Herding Cats blog is aware of the issues regarding #NoEstimates advocates. As a result, there are a number of hybrid implementations of Agile that have Agile purists howling and non-purists adapting as they always do. From my observations, however, there is an Ur-Agile out there common to all good implementations, and I wrote about it previously in this blog back in 2015. Given the time, I think it useful to repeat it here.
The best articulation of Agile that I have read recently comes from Neil Killick, with whom I have expressed some disagreement on the #NoEstimates debate and the more cultish aspects of Agile in past posts, but who published an excellent post back in July (2015) entitled “12 questions to find out: Are you doing Agile Software Development?”
Here are Neil’s questions:
1. Do you want to do Agile Software Development? Yes – go to 2. No – GOODBYE.
2. Is your team regularly reflecting on how to improve? Yes – go to 3. No – regularly meet with your team to reflect on how to improve, go to 2.
3. Can you deliver shippable software frequently, at least every 2 weeks? Yes – go to 4. No – remove impediments to delivering a shippable increment every 2 weeks, go to 3.
4. Do you work daily with your customer? Yes – go to 5. No – start working daily with your customer, go to 4.
5. Do you consistently satisfy your customer? Yes – go to 6. No – find out why your customer isn’t happy, fix it, go to 5.
6. Do you feel motivated? Yes – go to 7. No – work for someone who trusts and supports you, go to 2.
7. Do you talk with your team and stakeholders every day? Yes – go to 8. No – start talking with your team and stakeholders every day, go to 7.
8. Do you primarily measure progress with working software? Yes – go to 9. No – start measuring progress with working software, go to 8.
9. Can you maintain pace of development indefinitely? Yes – go to 10. No – take on fewer things in next iteration, go to 9.
10. Are you paying continuous attention to technical excellence and good design? Yes – go to 11. No – start paying continuous attention to technical excellence and good design, go to 10.
11. Are you keeping things simple and maximising the amount of work not done? Yes – go to 12. No – start keeping things simple and writing as little code as possible to satisfy the customer, go to 11.
12. Is your team self-organising? Yes – YOU’RE DOING AGILE SOFTWARE DEVELOPMENT!! No – don’t assign tasks to people and let the team figure out together how best to satisfy the customer, go to 12.
Note that even in software development based on Agile you are still “provid(ing) value by independently developing IP based on customer requirements.” Only you are doing it faster and more effectively.
With the possible exception of the “self-organizing” meme, I find that items 1 through 11 are valid ways of identifying Agile. Note, however, that the list says nothing about establishing closed-loop analysis of progress, nothing about estimates, and nothing about the need to monitor progress, especially on complex projects. As a matter of fact, one of the biggest impediments noted elsewhere in industry is the inability of Agile to scale. This limitation exists in its most simplistic form because Agile is fine for the development of well-defined, limited COTS applications and smartphone applications. It doesn’t work so well when one is pushing technology while developing software, especially for a complex project involving hundreds of stakeholders. One other note–the unmentioned emphasis in Agile is technical performance measurement, since progress is based on satisfying customer requirements. TPM, when placed in the context of a world of limited resources, is the best measure of all.
b. The integration of new technology into PM and how to upload the existing PM corporate knowledge into that technology.
This is two sides of the same coin. There is always debate about the introduction of new technologies within an organization and this debate places in stark contrast the differences between risk aversion and risk management.
Project managers, especially in the complex project management environment of aerospace & defense, tend, in general, to be a hardy lot. Consisting mostly of engineers, they love to push the envelope on technology development. But there is also a stripe of engineers among them who do not apply this same approach of measured risk to their project management and business analysis systems. When it comes to tracking progress, resource management, programmatic risk, and accountability, they frequently enter the risk-aversion mode–believing that the fewer eyes on what they do, the more leeway they have in achieving the technical milestones. No doubt this is true in a world of unlimited time and resources, but that is not the world in which we live.
Aside from sub-optimized self-interest, the seeds of risk aversion come from the fact that many of the disciplines developed around performance management originated in the financial management community, and many organizations still come at project management efforts from the perspective of the CFO organization. Such rice-bowl mentality, however, works against both the project and the organization.
Much has been made of the wall of honor for those CIA officers that have given their lives for their country, which lies to the right of the Langley headquarters entrance. What has not gotten as much publicity is the verse inscribed on the wall to the left:
“And ye shall know the truth and the truth shall make you free.”
- John VIII-XXXII
In many ways, those of us in the project management community apply this creed to the best of our ability in our day-to-day jobs, and it lies at the basis of all management improvement, from Deming’s concept of continuous process improvement through the application of Six Sigma and other management improvement methods. What is not part of this creed is applying improvement only when a customer demands it, even though they may have asked politely for some time. The more information we have about what is happening in our systems, the better armed the project manager and the project team are to apply the expertise that qualified those individuals for their jobs to begin with.
When it comes to continual process improvement, one does not need to wait to apply those technologies that will improve project management systems. As a senior manager (and well-respected engineer) told me when I worked in the Navy: “if my program managers are doing their job, virtually every element should be in the yellow, for only then do I know that they are managing risk and pushing the technology.”
But there are some practical issues that all managers must consider when managing the risks in introducing new technology and determining how to bring that technology into existing business systems without completely disrupting the organization. This takes good project management practices that, for information systems, include good initial systems analysis, identification of those small portions of the organization ripe for initial entry in piloting, and a plan of data normalization and rationalization so that corporate knowledge is not lost (a sketch of such a normalization map follows below). Adopting more open systems that militate against proprietary barriers also helps.
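As a minimal sketch of what such a normalization plan looks like at the data level, with all field names hypothetical, a simple mapping from an incumbent tool's legacy vocabulary to a canonical one preserves corporate knowledge during piloting:

```python
# A minimal sketch of a normalization map used during piloting: legacy field
# names from an incumbent tool are rationalized to a canonical vocabulary so
# corporate knowledge survives the migration. All names are hypothetical.
LEGACY_TO_CANONICAL = {
    "TASK_NO": "task_id",
    "EV":      "bcwp",
    "ACTUALS": "acwp",
}

def normalize(legacy_record: dict) -> dict:
    """Rename known legacy fields; pass unknown fields through unchanged."""
    return {LEGACY_TO_CANONICAL.get(k, k): v for k, v in legacy_record.items()}

print(normalize({"TASK_NO": "1.1", "EV": 90.0, "ACTUALS": 95.0}))
# {'task_id': '1.1', 'bcwp': 90.0, 'acwp': 95.0}
```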
c. The intersection of project management and business analysis and its effects.
As data becomes more transparent through methods of normalization and rationalization–and the focus shifts from “tools” to the knowledge that can be derived from data–the clear separation that delineated project management from business analysis in line-and-staff organizations becomes further blurred. Even within the project management discipline, the separation in categorization of schedule analysts from cost analysts from financial analysts is becoming an impediment to fully exploiting the advantages of looking at all of the data that is captured and that affects project performance.
d. The manner of handling Big Data, business intelligence, and analytics that result.
Software technologies are rapidly developing that break the barriers of self-contained applications–applications that perform one or two focused operations, or a highly restricted group of operations, providing functionality focused on a single or limited set of business processes through hard-coded high-level languages. These new technologies, as stated in the previous section, allow users to focus on access to data, making the interface between the user and the application highly adaptable and customizable. As these technologies are deployed against larger datasets that allow for integration of data across traditional line-and-staff organizations, they will provide insight that garners businesses competitive advantages and productivity gains over their contemporaries. Because of these technologies, highly labor-intensive data mining and data engineering projects that were thought to be necessary to access Big Data will find themselves displaced as their cost and lack of agility are exposed. Internal or contracted-out custom software development along these same lines will also be displaced, just as COTS displaced the high overhead associated with such efforts in other areas. This is because hardware and process developments are constantly shifting the definition of “Big Data” to larger and larger datasets, to the point where the term will soon have no practical meaning.
e. The role of the SME given all of the above.
The result of the trends regarding technology will be to put the subject matter expert back in the driver’s seat. Given adaptive technology and data–and a redefinition of the analyst’s role to a more expansive one–we will find that the ability to meet the needs of functionality and the user experience is almost immediate. Thus, when it comes to business and project management systems, while these developments make real the Agile characteristics that I outlined above, they also reveal the weakness of Agile’s applicability to more complex and technical projects. It is technology that will reduce the risk associated with contract negotiation, processes, documentation, and planning. Walking away from these necessary components of project management obfuscates and avoids the hard facts that oftentimes must be addressed.
One final item that Mr. Qureshi mentions in a follow-up post–and which I have seen elsewhere in similar forums–concerns operational security. In deploying new technologies, a gatekeeper must ensure that the technology will not open the organization’s corporate knowledge to compromise. Given the greater and more integrated information and knowledge garnered by new technology, it is incumbent on good managers to ensure that these improvements do not translate into undermining the organization.
What is the responsibility of those in high tech for ensuring that their products are used in an ethical manner? That information management is a product of empiricism is self-evident. Project and business managers who would delude themselves by relying on invalid information usually find themselves facing hard reality in a most unpleasant manner. How do we separate the fanciful from the real when information is flattened–highlighted by the issue, raised over the last year, of fake news? These are the issues that I address in my latest post at AITS.org. Please check it out.
It’s time to kick off my 2017 blogging activity, and my readers have asked about my absence on this blog. Well, because of the depth and research required by some of the issues that I consider essential, most of my blogging energy has been going to contributions to AITS.org. I strongly recommend that you check out the site if you haven’t already. A great deal of useful PM information and content can be found there–and they have a strong editorial staff, so what does get to publication is pretty well sourced. My next post on the site is scheduled for 25 January. I will link to it once it becomes available.
For those of us just getting back into the swing of things after the holidays, there were a number of interesting events that occurred during that time that I didn’t get a chance to note. Among these is that SecDef Ash Carter appeared (unfortunately behind a subscription wall) on an episode of Neil deGrasse Tyson’s excellent show “StarTalk“, which appears on the National Geographic Channel.
Secretary Carter had some interesting things to say, among them are:
a. His mentors in science, many of whom were veterans of the Second World War, instilled in him the concept of public service and giving back to the country.
b. His experience under former SecDef Perry, when he was Assistant Secretary of Defense for International Security Policy, taught him that the DoD needed to be the “petri dish” for R&D in new technologies.
c. That the approach of the DoD has been to leverage new technologies from the international technology industry, given that many good ideas and developments occur outside of the United States.
d. He encouraged more scientists to serve in the federal government and the Department of Defense, even if only for a short while, to get a perspective on how things work at that level.
e. He doesn’t see the biggest source of instability as necessarily coming from nation states; rather, small groups of individuals, given that destructive power is becoming portable, will be the emerging threat that his successor will face.
f. The imperative that the U.S. maintain its technological edge is essential in guaranteeing international stability and peace.
Secretary Carter’s comments, in particular his recognition that the technology industry is an international one, strike a particular personal chord with me, since my present vocation has caused me to introduce new capabilities in the U.S. market built from technologies that were developed by a close European ally. The synergy that this meeting of the minds has created has begun to have a positive impact on the small portion of the market that my firm inhabits, changing the way people do business and shifting the focus from “tools” as the source of information to data, and what the data suggests.
This is not to say that cooperation in the international technology market is not fraught with the same rocks and shoals found in any business area. But it is becoming increasingly apparent that new information technologies can be used as a means of evening the playing field because of the asymmetrical nature of information itself, which then lends itself to leverage given relatively small amounts of effort.
This also points to the importance of keeping an open mind and encouraging international trade, especially among our allies that are among the liberal democracies. Recently my firm was the target of a protest for a government contract where this connection to international trade was used as a means of questioning whether the firm was, indeed, a bona fide U.S. business. The answer under U.S. law is a resounding “yes”–and that first decision was upheld on appeal. For what we have done is–under U.S. management–leveraged technology first developed elsewhere, extended its capabilities, designed, developed, and localized it for the U.S. market, and in the process created U.S. jobs and improved U.S. processes. This is a good deal all around.
Back in the day, when I wore a U.S. Navy uniform during the Cold War, many of us in the technology and acquisition specialties looked to reform our systems and introduce innovative methods from wherever we could find them, whether they came from private industry or other government agencies. When coming upon resistance because something was “the way it always was done,” our characterization of that attitude was “NIH”–that is, “Not Invented Here.” NIH was a term that, in shorthand, described an invalid counterargument against process improvement that did not rely on the merits or evidence.
And so it is today. The world is always changing, but given new technologies the rate of change is constantly accelerating. Adapting and adopting the best technologies available will continue to give us the advantage as a nation. It simply requires openness and the ability to identify innovation when we see it.
Lydia Loveless, though merely 25 years old, has been on the music scene in a big way for about six years, wowing critics and music lovers with her alt-country songs about life and living, which fuse elements of trad country, rock, singer/songwriter, and punk. She hails from the town of Coshocton, Ohio, where she grew up on a farm and where her father ran a local honky-tonk for a while. A member of a musical family, she performed in the band “Carson Drew”, which drew its inspiration from the father in the Nancy Drew book series, along with her father, Parker Chandler, and older sisters, Eleanor Sinacola and Jessica.
She released her first album in 2010, entitled The Only Man. It was greeted by favorable reviews, especially on the alt-country scene. A little more than a year later she released the album Indestructible Machine on Bloodshot Records. This album of her original music dealt with issues regarding growing up in an insular rural town, dangerous relationships, and country staples such as isolation, drinking, and depression. The hard edge of her lyrics, which SPIN characterized as “utter lack of bullshit” by the “Ohio hellion”, appealed to a wider audience, and her music was greeted with rave reviews across the critical music spectrum.
She followed up Indestructible Machine with the EP Boy Crazy, which further solidified her musical cred and which served as a segue to the full album entitled Somewhere Else. Anyone who doubted that Loveless was a major talent was converted with this album. This past August she followed that one up with another gem entitled Real. This album, like her previous efforts, has garnered almost universal praise.
As she has matured, her voice, which is led by a Midwest twang, reveals great depth and control. At the core of her talent, which is multi-faceted, is her ability to exploit an expansive vocal range–one greater than found in most rock and country singers. Depending on the topic at hand, she travels–sometimes in the same song–from a singer with considerable pipes who can belt out a controlled and sustained melody, to verbal intimacy that expresses raw, scratchy emotion like a youthful Patti Smith. Her lyrics are mature beyond her years and reveal an openness and emotional vulnerability that only the most talented singers can maintain. It is a high wire act by someone barely aware of what she is doing–and we can only hope that she continues to eschew any artifice of self-awareness that, even among the most talented, can devolve into self-parody and archness.
Here she is performing “Somewhere Else” on Audiotree Live.