Lately, the topic of tech evolution has been popping up on my news radar a lot, and it got me pondering some questions on the matter.
What defines appropriate, moderated incorporation of modernizing technology into the art of music? Is it to be expected that things eventually change and depart from the past, or do some things need to stay permanently un-modernized, so that the learning process itself does not become compromised? I took note of a few examples that broach the subject and question at hand:
Musician Dave Grohl, the director and producer behind the “Sound City” documentary that came out this past February, felt the desire to tell the story behind this iconic studio, as well as to give people an outlet to tell their individual stories that crossed paths with the place. Hearing the history of Sound City is fascinating in and of itself, but one of the most poignant subjects Grohl brought unabashedly to the forefront was how he felt, and continues to feel, about the importance of maintaining the vitality of work done in the analog music space and recognizing the effort that goes into doing so.
“…I remember every night, we [Nirvana] would bring a cassette back to the Oakwood hotel where we were staying, and listen to what we had done [at Sound City] that day. Those imperfections, that’s cool. And it makes it sound like people…to me, Sound City represents some sort of integrity…which is very human. Actual people [making music,] that inspired millions and millions of fans all over the place to do the same thing…telling the story of Sound City is one thing. Plugging in and actually putting it through the board and on a two-inch reel? That’s what I’m talking about.”
Following a revisitation of this film, I asked on my Twitter feed, not long ago, what people’s thoughts were on the idea that bands who take upwards of three to four years to complete new albums should turn more to the use of analog production methods. The one direct response I received took the stance that it “sounds very expensive.” That might very well be true when compared with the cost of producing an identical work via today’s digital methods.
Combine that with the state of profit in some sectors of the industry being what it is, and trying to save money makes sense. Still, if staying above the line between profit and loss is what determines many of the gargantuan “stay or go” decisions in the business, then we might keep up some semblance of money-making longer than if we had stayed old school, but we will do so at the expense of the very base of what makes music…well, music. What exactly do I mean by that? Take a look at a potential “evolution” in instrument design that has some people, non-musicians and professionals alike, intrigued:
“The 150-year-old German polymer giant, Bayer MaterialScience has fashioned [a] “futuristic cello” from utterly transparent, lightweight cast resin. The company’s aim was to encourage aspiring musicians to take up the cello by making what is normally a rather cumbersome instrument, easier to play and carry.
[Designers and lighting specialists] found that music students wanted color signals to let them know when they were out of tune or to flash like a metronome to help them keep time…”
It might not be wrong to want to modernize the cello and other physically laborious instruments like it, so that there is less likelihood of damage, easier transport…etc. After all, electrified versions of stringed instruments, as well as other families of instruments, have long since been created and updated over the decades. However, when it comes to other concepts for change coming out of the woodwork (ironic pun completely, totally intended), especially ones like a dependence on color and light that would fundamentally alter how a student becomes familiar with proper playing and execution, do we risk taking tech too far beyond the appropriate boundaries that shape the development of internalized musical skill?
Quite simply, music is a sound-based art. One’s ears need to be trained and honed in on different timbres, pitches, accuracies and inaccuracies, among other things. If, for example, the ideas behind the Cello 2.0 were indeed put into more widespread circulation and became anywhere near “the new standard” for initiating instrument training and education, then what becomes of the overall development of neural processes and relationships relevant to musical structure, like those researched in the scholarly collection “Cognitive Neuroscience of Music”? How might they be affected in the long run?
Naming just a few:
Recall (Free, Cued, Serial)
Just think of the mountains of studies that would need to be conducted to get any kind of accurate picture of how such a severe grafting of sight dependency onto sound recognition would affect the bottom line of a student’s internalized skill over the course of a lifetime of practice…
If color signals were to alert students to missed pitches, let a student be exposed to that for enough years and what kind of effect would that have? Absent a color signal, would students be able to recall and/or identify key pitches like A4 or C4 and sing out nearby tones with some observable level of relative pitch? Would their ability to properly identify incorrect intervals and more intricate tone relationships be weaker, or, even worse, absent entirely, without the color present?
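To ground the pitch references above: in twelve-tone equal temperament, each semitone multiplies a frequency by the twelfth root of two, so the frequency of any pitch, and the interval between any two pitches, follows directly from one reference point. A minimal sketch of that arithmetic (my own illustration, assuming the standard A4 = 440 Hz concert reference, and nothing drawn from the Cello 2.0 design itself):

```python
import math

# Twelve-tone equal temperament: each semitone step multiplies
# frequency by 2**(1/12).
A4_HZ = 440.0  # standard concert pitch reference (assumed)

def note_frequency(semitones_from_a4: int) -> float:
    """Frequency of a pitch a given number of semitones above (or below) A4."""
    return A4_HZ * 2 ** (semitones_from_a4 / 12)

def interval_in_semitones(freq_low: float, freq_high: float) -> float:
    """Recover the interval between two frequencies, in semitones."""
    return 12 * math.log2(freq_high / freq_low)

# C4 ("middle C") sits nine semitones below A4.
c4 = note_frequency(-9)
print(round(c4, 2))                                 # ~261.63 Hz
print(round(interval_in_semitones(c4, A4_HZ), 1))   # 9.0 semitones (a major sixth)
```

This is exactly the kind of relationship a trained ear internalizes without ever seeing the numbers, which is what makes the prospect of replacing ear training with color cues worth questioning.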
A recent article published in The Telegraph by music critic Ivan Hewitt elaborated on a keynote that was part of the “Battle of Ideas” held at London’s Barbican Center between October 19-20. Hewitt exposed another angle, another potential way modern-day life could be stunting human development in connection with music – specifically, how we listen to it. He does this not in the same way I alluded to above, but its complementary relevance to the main question stands, as it further highlights the risks tech-centric western culture is bringing to music. Hewitt questions whether all the exposure to tech, and our shorter attention spans, are ruining our ability to listen to music:
“Increasingly our young folk are growing up without even a minimal acquaintance with [the] language [of music]. And all of us, young and old alike, find it harder to create the calm mental space needed to focus on a developing musical argument. As Christopher Lasch liked to put it, we’re “distracted from distraction by distraction”.
So one vital question is: how do we get the necessary knowledge across to the next generation? Traditionally there have been two ways. There’s the knowledge you get from doing something; playing the piano, using composing software. And there’s the knowledge of the music itself, which used to be taught in classrooms under the heading “music appreciation”. We need to ask which comes first; the doing or the listening. My feeling is it has to be the former.”
Returning to the visual: while not everyone turns to memorization in their musical training (meaning there is a degree of recognition involved in seeing and playing the right notes), the visual principles of music theory are typically unveiled in tandem with exposure to, and learning of, tonal variations. The intricacies of the two, and how they define one another, are explained together, much like what Hewitt refers to in his quote. They are shown to have distinct connections to one another that turn the otherwise vast sea of sound into a medium capable of being reined in and independently shaped for compositional purposes.
Yes, electric instruments have been around for a long while. Even though some extreme purists may have shunned those modern changes as they happened throughout history, the difference in being able to defend those “modern upgrades” lies in the fact that something like the electric guitar added a previously unheard kind of timbre and performance style to the compositional world. The electronic aspect genuinely added something to the instrument world and to the range of musical options of the time. In other words, it broke serious new ground. In the case of the electric guitar, the steady movement toward creating and including digital components is an augmentation rather than a diminishment.
The thing is, though, that just like discoveries and developments in other fields like biology or mathematics, our predecessors had more leeway to discover and create things that were seen as truly new or different. There’s no reinventing the wheel; nowadays all the major, “easy” changes have been made. The more time goes on, the more we are not really creating something new so much as adjusting what has already been provided. I sometimes think to myself, “All we have left are the really hard questions, like: how do we cure cancer? How can we travel faster than the speed of light? What’s the solution to creating energy that will be sustainable and affordable in the long run?”
Just the same, here in the present day, the electronic and digital reshaping of musical instruments, along with our relationships to, and understanding of, them isn’t breaking new ground. Instead, it really feels more like digitizing for the sake of trend over function. The breakdown of Cello 2.0 comes across as convenience in size and convenience in learning. The cello was built the size it was for a reason. As for the learning-with-light-and-color aspect: it would be a newer way to learn, but just because it is new, does that make it best or right?
In keeping with this line of thinking, would you agree it makes sense to keep things like analog recording processes, acoustic instruments and other, more manual, aspects of music education solidly active where students are involved? The whole tone of Sound City’s life story is that most of the world watched its analog ways fall back in the dust and the studio inevitably close, while Grohl believes analog’s value deserves continual acknowledgment. It is fine if the business landscape is going to change and active professionals want to mix things up for the sake of performance or image. However, I don’t believe it would ever be a good thing for ‘what came before’ to completely fade into the background, leaving aspiring generations to start out entirely removed from what got music to the sleek, modern, computer-driven point it is at now.
This post isn’t about taking the stance that all things digital have to go.
It’s more about asking whether we need to keep perspective on the whole of music’s timeline amidst our ongoing desire for change. Even if things inevitably slip, however slowly or rapidly, down the digitizing and modernizing slope, shouldn’t the root understanding of sound, and the capture thereof, be something we are all invested in retaining too – no matter how fast the world’s processors get or how highly defined our touchscreens become? Failing to keep that root a priority through the generations seems like the slow murder of our own comprehension of an entire branch of physics and the auditory phenomena that have naturally occurred around us since the dawn of time.
If such a “death” were to happen, the music business landscape would be undeniably changed, gaining an irreversible skew in what, specifically, keeps the “business” in the music business steady and financially alive. After all, consumer wants and business needs change in parallel with the trends we push. The only way that wouldn’t happen is if we dreamt and brainstormed without ever acting upon our ideas – and that would just be a plain loss for everybody.