By Abby Kaplan author of Women Talk More Than Men and Other Myths about Language Explained
For years now, observers have been alert to a growing social menace. Like Harold Hill, they warn that there’s trouble in River City — with a capital T, and that rhymes with P, and that stands for Phone.
Mobile phones are a multifaceted scourge; they’ve been blamed for everything from poor social skills to short attention spans. As a linguist, I’m intrigued by one particular claim: that texting makes people illiterate. Not only are text messages short (and thus unsuited for complex ideas), they’re riddled with near-uninterpretable abbreviations: idk, pls, gr8. Young people are especially vulnerable to these altered forms; critics frequently raise the specter of future students studying a Hamlet who texts 2B or not 2B.
The puzzling thing is that none of these abominable abbreviations are unique to text messaging, or even to electronic communication more generally. There’s nothing inherently wrong with acronyms and initialisms like idk; similar abbreviations like RSVP are perfectly acceptable, even in formal writing. The only difference is that idk, lol, and other ‘textisms’ don’t happen to be on the list of abbreviations that are widely accepted in formal contexts. Non-acronym shortenings like pls for please are similarly unremarkable; they’re no different in kind from appt for appointment.
Less obvious is the status of abbreviations like gr8, which use the rebus principle: 8 is supposed to be read, not as the number between 7 and 9, but as the sound of the English word that it stands for. The conventions for formal written English don’t have anything similar. But just because a technique isn’t used in formal English writing doesn’t mean that technique is linguistically suspect; in fact, there are other written traditions that use exactly this principle. In Ancient Egyptian, for example, the following hieroglyph was used to represent the word ḥr ‘face’:
It’s not a coincidence, of course, that the symbol for the word meaning ‘face’ looks like a face. But the same symbol could also be used to represent the sound of that word embedded inside a larger word. For example, the word ḥryt ‘terror’ could be written as follows:
Here, the symbol has nothing to do with faces, just as the 8 in gr8 has nothing to do with numbers. The rebus principle was an important part of hieroglyphic writing, and I’ve never heard anyone argue that this practice led to the downfall of ancient Egyptian civilization. So why do we think textisms are so dangerous?
Even if there’s nothing wrong with these abbreviations in principle, it could still be that using them interferes with your ability to read and write the standard language. If you see idk and pls on a daily basis, maybe you’ll have a hard time remembering that they’re informal (as opposed to RSVP and appt). But on the other hand, all these abbreviations require considerable linguistic sophistication — maybe texting actually improves your literacy by encouraging you to play with language. We all command a range of styles in spoken language, from formal to informal, and we’re very good at adjusting our speech to the situation; why couldn’t we do the same thing in writing?
At the end of the day, the only way to find out what texting really does is to go out and study it in the real world. And that’s exactly what research teams in the UK, the US, and Australia have done. The research in this area has found no consistent negative effect of texting; in fact, a few studies have even suggested that texting might have a modest benefit. It seems that all the weeping and gnashing of teeth about the end of literacy as we know it was premature: the apocalypse is not nigh.
Of course, this doesn’t mean that we should all spend every spare minute texting. (I’m a reluctant texter myself, and I have zero interest in related services like Twitter.) There are plenty of reasons to be thoughtful about how we use any technology, mobile phones included. What we’ve seen here is just that the linguistic argument against texting doesn’t hold water.
View the Women Talk More Than Men…and Other Language Myths Explained book trailer by clicking on the image below…
Blog post written by Yellowlees Douglas author of The Reader’s Brain: How Neuroscience Can Make You A Better Writer
Journalists, particularly those writing for American audiences, practically have transitions drilled into their heads from their first forays into writing for the public. Where’s your transition? their editors persist, as they linger over each sentence. However, those editors and newsroom sages handed on advice with well-established roots in psycholinguistics—and with particularly striking benefits for the reading public. I explore what linguistics, psychology, and neuroscience can teach us about writing in my forthcoming The Reader’s Brain: How Neuroscience Can Make You a Better Writer. And using an abundance of transitions is perhaps the simplest advice you can follow to make your writing easy to read, in addition to bolstering your readers’ speed and comprehension of even complex, academic prose.
As a species, we evolved to learn from observing cause and effect—and from making predictions based on those observations. For example, your everyday survival relies on your ability to predict how the driver to your right will behave on entering a roundabout, just as we predict hundreds of events that unfold in our daily lives, all of which dictate our behavior. But we feel relatively minimal cognitive strain from all these predictions, mostly made without any conscious awareness, because we can make predictions based on prior experience. We expect the familiar.
Similarly, in reading, we expect sequential sentences to relate to one another. Most writers, however, assume that their readers will automatically see the ideas represented in one sentence as connected to the preceding sentence. But sentences can become islands of meaning, especially when writers fail to provide explicit linguistic cues that tell readers how one sentence follows from another.
Take, for example, your typical university mission statement, the kind invariably featured in American university catalogues and websites:
Teaching—undergraduate and graduate through the doctorate—is the fundamental purpose of the university. Research and scholarship are integral to the education process and to expanding humankind’s understanding of the natural world, the mind and the senses. Service is the university’s obligation to share the benefits of its knowledge for the public good.
Chances are, even if someone offered you the lottery jackpot for recalling this content in a mere half-hour, you’d fail—at least not without some serious sweat put into rote memorization. Why? Despite the mission statement containing a mere three sentences, nothing connects any sentence to the others—aside from the writer’s implicit belief that everyone knows that universities focus on teaching, research, and service. Unfortunately, only an academic would understand that research, teaching, and service form the bedrock of any research university. As a result, we can safely guess that the writer was an academic. Sadly, the actual audience for the mission statement—the family members tendering up their retirement savings or mortgaging the house for tuition—fails to see any connections at all. As studies dating as far back as the 1970s have documented, readers read these apparently disconnected sentences more slowly and with greater activity in the parts of the brain dedicated to reading. Readers also show poorer recall of sentences lacking any apparent logical or referential continuity.
Because prediction is the engine that enables readers’ comprehension, transitions play a vital role in helping us understand how sentences relate to one another. In fact, certain types of transitions—particularly those flagging causation, time, space, protagonist, and motivation—bind sentences more tightly together. When you use as a result, thus, then, because, or therefore, your reader sees the sentence she’s about to read as causally related to the sentence she’s just read. Moreover, when writers place transitions early in sentences, prior to the verb, readers grasp the relationship before they finish making predictions about how the sentence will play out. These predictions stem from our encounters with the tens of thousands of sentences we’ve previously read. But put the transition after the verb, and your readers have already completed the heavy lifting of prediction. Or, worse, they’ve made the wrong predictions and need to reread your sentences.
You might think that a snippet like too or also or even flies beneath your readers’ radar. Think again. Transitions are your readers’ linguistic lifelines, linking sentences and ideas smoothly together and making your writing easy to understand and recall. You can discover more not only about transitions but also about how your readers’ brains work through every facet of your writing—from the words you choose to the cadence of your sentences—in The Reader’s Brain: How Neuroscience Can Make You a Better Writer.
Blog post written by Terttu Nevalainen based on an article in the latest issue of English Language and Linguistics
Social media and mobile technology have increased and accelerated human interconnectedness and social networking on a global scale. It is a common observation that new words and expressions travel fast in these networks. But our primary medium of communication is still spoken interaction, and this is how language is transmitted to the next generation. Linguists have long been interested in the influence of social networks on language learning, use and, ultimately, on language change. They have shown how people either tend to maintain the language they once acquired or become more apt to change it depending on the kinds of social network relationships they have contracted. Now, two questions intrigue the language historian. First, is it possible to gain access to the social networks of people who lived centuries ago and, second, to the extent that it is, could such networks perhaps be used to trace language change?
Present-day sociolinguistic studies approach these issues by using various methods first to chart network structures and their content, and then record and analyse the network members’ language use in different contexts. Historical studies present particular challenges in both respects. My article highlights the role of place in historical social network research. I begin by discussing the kinds of data – official documents, personal letters and diaries – that social historians have used in reconstructing past communities and social networks. I suggest that these analyses can be enriched by adding linguistic data, and that language historians’ findings on linguistic change may often be interpreted in terms of social networks.
Focusing on Early Modern London, I present two case studies. The first one investigates a 16th-century merchant family exchange network, active in London, Calais and Antwerp, in an analysis based on their extensive family correspondence. Personal letters are supplemented with the wealth of information provided by a private diary in the second case study of the 17th-century naval administrator Samuel Pepys. Pepys’s role as a community broker between the commercial City of London and the Royal Court at Westminster is assessed in linguistic terms. My results show how identifying the leaders and laggers of linguistic change can add to our understanding of the varied ways in which linguistic innovations spread to and from Tudor and Stuart London both within and across social networks.
We invite you to read the full article ‘Social networks and language change in Tudor and Stuart London – only connect?’ here
Post written by author Deborah Brandt discussing her recently published book The Rise of Writing
The belief that writing ability is a subsidiary of reading ability runs deep in society and schooling. You can only write as well as you can read. The best way to learn how to write is to read, read, and read some more. Commonplaces like these are easy to find in the advice of teachers and often well-known authors as well. Reading is considered the fundamental skill, the prior skill, the formative skill, the gateway to writing. At minimum, reading is thought to teach the techniques of textuality, the vocabulary, diction, spelling, punctuation, and syntax that any aspiring writer must master. Even more profound, reading is thought to shape character and intellect and provide the wisdom and worldliness that make one worthy to write. In every way reading is treated as the well from which writing springs. We need only try to reverse the commonplace advice to appreciate the superior position that reading holds. How many would readily agree that you can only read as well as you can write? Or that the best way to learn how to read is to write, write, and write some more? Writing has never attained the same formative and morally wholesome status as reading. Indeed, writing unmoored from the instructiveness of reading is often considered solipsistic and socially dangerous.
But in the wider society and over the last fifty years or more, writing has ascended as the main basis of many people’s daily literacy experiences and the main platform for their literacy development. Millions of working adults now spend four hours or more each day (sometimes, a lot more) with their hands on keyboards and their minds on audiences, writing so much, in fact, that they have little time or appetite for reading. In the so-called information economy writing has become a dominant form of labor and production. As a result, writing is eclipsing reading as the literate experience of consequence. Spurred on especially by digital technologies, writing is crowding out reading and subordinating reading to its needs. The rise of writing over reading represents a new chapter—and a new challenge—in the history of mass literacy, a challenge especially for the school, which from its founding has been much more organized around a reading literacy, around a presumption that readers would be many and writers would be few.
But now writers are becoming many. What are some of the changes that we need to pay attention to? Increasingly, people read from inside acts of writing, as they respond to others; research, edit or review other people’s writing; or search for styles or approaches to use in their own writing. “Reading to write” in school has usually meant using reading to stimulate ideas or generate content, but in the wider world reading to write actually stands for a broader, more diverse, more diffused, more sustained and more comprehensive set of practices. Increasingly, how and why we write conditions how and why we read. Relatedly, we write among other people who also write. Learning to write along with other people who write (rather than from authors who address us abstractly) is a new aspect of mass literacy development. Audiences are made up not merely (or mostly) of receptive readers but also responsive writers; increasingly people write to catalyze or anticipate other people’s writing and people read with the aim of writing back.
Further, in an information society, writing is consequential. The kind of writing done by everyday people turns the wheels of finance, law, health care, government, commerce. As the power and consequence of writing courses through the consciousness of everyday people, their acts of writing are often sites of intellectual, moral, and civic reflection—but not necessarily in the same ways as acts of reading. Reading is an internalizing process. That is why the effects of literacy have been sought mostly on the inside: in the formation of character or the quality of inner life or intellectual growth. But writing is a relentlessly externalizing process. Because writing unleashes language into the world, it engages people’s sense of power and responsibility. It can be expected to bring more wear and tear, potentially more trouble. Writing risks social exposure, blame, even, in some cases, retaliation. It requires a level of courage and ethical conviction rarely cultivated in school-based literacy and rarely measured in standard assessments of writing ability.
We are at a critical crossroad in the history of mass literacy in which relationships between writing and reading are undergoing profound change. Writing is overtaking reading as the skill of critical consequence. Until only recently writing was a minor strain in the history of mass literacy, playing second fiddle to reading. But it is surging into prominence, bringing with it a cultural history, a set of cognitive dispositions, and a developmental arc that stand in contrast to reading. As an educational community, we have been slow to incorporate these shifting relationships into the questions we ask and the perspectives that we take. That writing remains so under-studied and under-articulated in comparison to reading is perhaps our greatest challenge.
To find out more about Deborah Brandt’s new book published by Cambridge University Press please click here
Blog post written by Kevin McCafferty based on an article in the latest issue of English Language and Linguistics
The decline of first-person shall in Ireland, 1760–1890
The Irish just don’t use first-person shall, and they never have. They’ve always said Will I close the window? and We will be there soon. That’s the consensus of grammarians and other commentators from the eighteenth century onwards. And linguists who have studied Irish English in recent decades agree that shall is virtually non-existent in the English of the Irish. So ingrained is this view that the decline of shall in North America – which is now affecting British English, too – is even attributed to the influence of Irish immigrants.
This study uses the Corpus of Irish English Correspondence to look at shall/will usage in Irish English over a period of 130 years. Contrary to the grammarians’ accounts, shall was the predominant form with first-person subjects in eighteenth-century Ireland. The present-day situation, with only will being used in Irish English, did not emerge until late in the nineteenth century. This was not unique to Irish English: comparison with Canadian English shows this shift happening at roughly the same time in both varieties, raising questions about the role of the Irish in the decline of shall in North America.
We suggest that the change from shall to will in Irish English might have been due to simplification during the process of language shift. Also, since first-person will was retained in nonstandard varieties even in England long after shall was established in the standard language, increased use of will might have arisen as a result of growing literacy and the colloquialisation of English: as more texts like letters were produced by members of the lower social strata, their more nonstandard or vernacular usage made it into writing. The shift to first-person will that is apparent in Irish English would then be the outcome of language shift and greater lower-class literacy.
Read the full article ‘“[The Irish] find much difficulty in these auxiliaries . . . putting will for shall with the first person”: the decline of first-person shall in Ireland, 1760–1890’ here
Article written by Kevin McCafferty and Carolina P. Amador-Moreno
Blog post written by Réka Benczes, based on an article in the latest issue of English Language and Linguistics
One of the most intriguing – and least studied – areas of English word-formation is that of so-called “tautological compounds”, which are formed out of synonyms (such as subject matter), or in which one of the constituents is already included in the meaning of the other (such as oak tree). Their oddity can be attributed to two main reasons. First, as their name, “tautological compound”, implies, at face value such combinations can be considered prime examples of the redundancy of language. Second, they do not follow normal compound-forming rules, in the sense that both constituents can function as the semantic head – as opposed to “normal” English compounds, where the head element of the compound is always the right-hand member (hence apple tree is a type of tree, and not a type of apple).
Perhaps due to their quirkiness, not much has been said about tautological compounds in traditional accounts of compounding, which typically relegate them to a marginal area of English. However, there is more to tautological compounds than meets the eye. First of all, the study demonstrates that the term “tautological compound” is a misnomer, as such combinations are far from being tautological or redundant in their meaning. Accordingly, the paper differentiates between hyponym-superordinate compounds (such as tuna fish and oak tree) and synonymous compounds (such as subject matter or courtyard) and claims that both types play important roles in language.
Hyponym-superordinate compounds are remnants of our early acquisition of taxonomical relations by making the link between the hierarchical levels explicit. At the same time, hyponym-superordinate compounds are also used to dignify and upgrade concepts via the conceptual metaphor more of form is more of content, whereby a linguistic unit that has a larger form is perceived to carry more information (that is, more content) than a single-word unit.
Synonymous compounds have been shown to possess an emphatic feature, which has been exploited mainly in poetic language (as in the works of Coleridge). However, synonymous compounds are still very much present in everyday language, though in a slightly different form – as synonym-based blends (e.g., chillax “to calm down or relax” from chill+relax, or chivers “chills or shivers” from chill+shivers).
While tautological compounds have been around for a rather long time in the English language, they have received very little attention (if any) from linguists. Yet they provide fascinating insights into the motivational processes behind compounding, thereby making it necessary to assign this much-neglected category to its proper, well-deserved place within English word formation.
We invite you to read the full article ‘Repetitions which are not repetitions: the non-redundant nature of tautological compounds’ here
Blog post written by John Payne and Eva Berlage
Everything you ever wanted to know about the genitive alternation in English! The choice that speakers have between the s-genitive and the of-genitive (e.g. the production’s new director vs the new director of the production) has been the subject of much detailed research, starting with historical studies in the earlier part of the twentieth century and culminating in recent large-scale synchronic studies using modern statistical techniques. It is, as Anette Rosenbach suggests in the volume, “arguably the best researched of all syntactic alternations in English”.
This special edition, arising from a workshop organised by John Payne (Manchester) and Eva Berlage (Hamburg) at the ISLE conference in Boston in 2011, collects together four new papers. The first, by Anette Rosenbach, is above all an authoritative and masterly synopsis of all previous work on the alternation, and will be an invaluable resource both for those who are interested in the methodologies which exist for analysing syntactic variation and for those who have a specific interest in the genitive alternation itself. However, beyond this, it also raises interesting and controversial questions which should be addressed by future research.
Factors such as the animacy, weight and definiteness of the “possessor” (e.g. the production in the production’s new director), as well as the nature of the semantic relation holding between the possessor and “possessee”, are well known to play an important role in speaker choice. The remaining three papers add new dimensions by undertaking detailed quantitative studies of previously under-investigated aspects of the alternation. Ehret, Wolk and Szmrecsanyi, using historical data from the ARCHER corpus, expand the discussion of weight by comparing different methods of assessing weight, in particular the use of word and character counts. Their research shows that length does not have a linear effect on the distribution of the s- and of-genitive. The authors also break new ground by including a detailed study of the role that rhythmic effects might play. While the so-called Principle of Rhythmic Alternation (following Schlüter 2005: 18) so far only comes out as a minor determinant of the variation, the paper raises the question of whether other operationalisations of phonological variables should be included for a fuller understanding of the variation. In Jankowski & Tagliamonte’s contribution, there is an innovative focus on sociolinguistic factors. In particular, the authors investigate the distribution of the s-genitive and of-genitive in vernacular Canadian English, basing their research on socially stratified corpora that represent data from speakers of all age groups. Their research shows that use of the s-genitive has been growing with possessors that represent collectives or organisations, a trend that might also be spreading to place possessors.
The volume concludes with a paper by Payne and Berlage on the “niche” role that the less frequently used oblique genitive, the construction we see in examples such as a friend of the director’s, plays in the alternation, providing a new quantitative analysis of the factors which make this construction either a forced (or preferred) choice in comparison with the two main constructions. A qualitative comparison of the s-genitive, of-genitive and oblique genitive finally reveals that the semantic relations represented by the oblique genitive are a subset of those covered by the s-genitive which, again, are a subset of those available to the of-genitive.
Explore the entire special issue of English Language and Linguistics here
Post written by Peter Siemund based on an article in the latest issue of English Language and Linguistics
If we are to believe standard grammatical descriptions, English possesses only very few reflexive verbs, i.e. verbs that obligatorily occur with the reflexive marker myself, yourself, himself, etc. Quirk et al. (1985: 357–8), for example, treat the verbs pride, absent, avail, demean, ingratiate, perjure as ‘reflexive verbs’, as these obligatorily take the reflexive pronoun. Besides these, they distinguish ‘semi-reflexive verbs’ (e.g. behave, feel, adjust, prepare) “where the reflexive pronoun may be omitted with little or no change of meaning” (Quirk et al. 1985: 358). A similar list of “verbs that select mandatory reflexives” is discussed in Huddleston & Pullum (2002: 1487–8). Both grammars suggest that the list of obligatorily reflexive verbs in English is not very extensive.
Geniušienė (1987) and Siemund (2010) offer extensive lists of verbs (motion middles, anticausatives, lexicalizations) that occur together with reflexive pronouns. Nevertheless, these studies are purely synchronic, analyzing a sample of fictional texts and a sample drawn from the British National Corpus (BNC) respectively. Peitsara (1997) also offers verb lists, though without differentiating between reflexive and middle uses of the verbs in these lists, as her focus lies on strategies of reflexive marking.
The main aim of the present contribution is to add a diachronic dimension to these studies by tracing the history of reflexive-marked verbs in middle functions through time. To that end, the history of the verbs that partake in the aforementioned processes is scrutinized using the Oxford English Dictionary (OED; Simpson & Weiner 1989) as a database. I here explore if and when the relevant verbs begin to occur with reflexive pronouns in essentially non-reflexive functions. The result is a fine-grained survey of the history of reflexive verbs in English that can inform and correct current assumptions, as reflected in grammar books and dictionaries, about grammaticalization and lexicalization processes in this domain, perhaps even in general. Moreover, my study adds a piece to the puzzle of the numerous changes that have occurred in the English lexicon. The Oxford English Dictionary proves to be a rich and highly valuable data source for carrying out serious grammatical analyses.
Read the full article ‘The emergence of English reflexive verbs: an analysis based on the Oxford English Dictionary’ here
by Louise Cummings
Nottingham Trent University, UK
As academic researchers, linguists are increasingly being asked to demonstrate the impact of their work on the lives of individuals and on the growth of national economies. There is one field within linguistics where impact is more readily demonstrated than in any other. This is the study of the many ways in which language and communication can break down or fail to develop normally in children and adults with communication disorders. These disorders are the focus of a recently published handbook, the Cambridge Handbook of Communication Disorders, which brings together 30 chapters on all aspects of the classification, assessment and treatment of communication disorders. The chapters in this volume will speak for themselves. My purpose in this short extract is to demonstrate how, in an age of impact, the case for the academic study and clinical management of communication disorders could not be more persuasive.
I begin by revisiting a quotation which I included in the preface to the handbook. It is a comment which was made in 2006 by Lord Ramsbotham, the then Chief Inspector of Prisons in the UK. He remarked: ‘When I went to the young offender establishment at Polmont, I was walking with the governor, who told me that if, by some mischance, he had to get rid of all his staff, the last one out of the gate would be his speech and language therapist’. This statement focuses attention quite forcefully on an issue which clinicians and educationalists have known for years: the remediation of impoverished language and communication skills can have a significant, positive impact on one’s life chances and experiences in a range of areas. These areas include social integration, psychological well-being and occupational and educational success. Conversely, the neglect of language and communication impairments presents a significant barrier to academic achievement, vocational functioning and social participation. The area of professional practice which aims to mitigate these harmful consequences of communication disorders – speech and language therapy (UK) or speech-language pathology (US) – has played an increasingly important role in recent years in raising awareness of these disorders. That increased awareness has been felt not just among members of the public in the form of greater tolerance and understanding of communication disorders, but also in policy areas which have the power to transform the provision and delivery of speech and language therapy services.
“It is clear that a society which neglects communication disorders among its citizens can expect to sustain significant economic harm.”
If the human impact of communication disorders does not persuade the reader of the merits of this area of academic and clinical work, then perhaps the economic implications of these disorders will make the case even more convincingly. A report1 commissioned by the Royal College of Speech and Language Therapists in the UK and published in 2010 found that speech and language therapy across aphasia, specific language impairment and autism delivers an estimated net benefit of £765 million to the British economy each year. In 2000, the economic cost of communication disorders in the US was estimated to be between $154 billion and $186 billion per year, which is equal to 2.5% to 3% of the Gross National Product.2 It is clear that a society which neglects communication disorders among its citizens can expect to sustain significant economic harm. This is in addition to the abdication of any type of social responsibility for the welfare of its people.
1 Marsh, K., Bertranou, E., Suominen, H. and Venkatachalam, M. (2010) An Economic Evaluation of Speech and Language Therapy. Matrix Evidence.
2 Ruben, R.J. (2000) ‘Redefining the survival of the fittest: Communication disorders in the 21st century’, Laryngoscope, 110 (2 Pt 1): 241-245.
The Cambridge Handbook of Communication Disorders is now available from Cambridge University Press.
by Julie Tetel Andresen
Duke University, North Carolina
My favorite words in Romanian are those of Turkish origin. Because parts of present-day Romania were under Ottoman rule for a long time, it’s natural that Romanian would have lexical borrowings from Turkish. One is the word for tulip. Now, tulips are not native to Holland. They are native to Central Asia, and in the eighteenth century there was a craze for tulips at the Ottoman court, and images of tulips could be found on clothing and furniture, while real tulips flourished in gardens and parks. Still today the tulip is a symbol for Turkey. The English word ‘tulip’ comes from the Turkish word tulbend ‘turban’ because the flower resembles the shape of a turban. However, the Turkish word is lâle, and the Romanian word is lalea.
Why do I like this word? Because it’s fun to say, especially in the plural: ‘tulips’ is lalele and ‘the tulips’ is lalelele. There’s ‘coffee’ cafea, ‘coffees’ cafele, and ‘the coffees’ cafelele. The same goes for ‘hinge’ balama, plural ‘hinges’ balamale and ‘the hinges’ balamalele, and for ‘crane (piece of construction equipment)’ macara, ‘cranes’ macarale and ‘the cranes’ macaralele. Not all Turkish borrowings have the phonetic form that generates these plurals, and not all words in Romanian with this plural type come from Turkish, but most of them do.
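The pattern behind these forms is regular enough to sketch mechanically. Below is a toy Python sketch (my own illustration, not from the original post, and covering only the words cited above): nouns ending in stressed -ea swap -ea for -ele in the plural, nouns ending in stressed -a add -le, and the definite plural stacks the enclitic article -le onto the plural.

```python
def plural(noun: str) -> str:
    """Toy plural rule for Romanian nouns ending in stressed -a/-ea."""
    if noun.endswith("ea"):
        return noun[:-2] + "ele"   # lalea -> lalele, cafea -> cafele
    return noun + "le"             # balama -> balamale, macara -> macarale

def definite_plural(noun: str) -> str:
    """Definite plural: the plural plus the enclitic article -le."""
    return plural(noun) + "le"     # lalele -> lalelele

for word in ["lalea", "cafea", "balama", "macara"]:
    print(word, plural(word), definite_plural(word))
```

Running this reproduces the forms in the post: lalea/lalele/lalelele, cafea/cafele/cafelele, balama/balamale/balamalele, macara/macarale/macaralele. Real Romanian morphology is of course messier; the sketch only captures the -le-stacking that makes these words fun to say.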
The other reason I like Turkish borrowings in Romanian is they often come with nice semantic twists. The word belea is usually used in the plural belele and means ‘troubles,’ which is tinged almost, but not quite, with a sense of the ridiculous. When I think of ‘my troubles’ as belelele mele, they don’t seem so bad. And what could be better than the word beizadea ‘son of a bei, a high ranking Turkish official’? It would never be used in Romanian as a compliment, and we need such a word in English, because entitled spoiled brat doesn’t quite cover it.
Finally, there’s the Romanian word for ‘neighborhood, suburb’ mahala, and it, too, is freighted with negative connotations. The politică de mahala, which includes personal attacks and reckless speech, would characterize much of what’s gone on in Washington DC in recent years. Those readers with knowledge of Arabic will recognize the root halla ‘to lodge’ with the place prefix ma-, making a word that means something like ‘building.’ So, the Turkish borrowing is itself a borrowing from Arabic. This word was also borrowed into Persian and is immortalized in the name Taj Mahal, which means in Persian ‘best of buildings.’ So, in the western extent of this etymon, we have a down-market usage, while in the eastern extent, we find something beautiful. Romania has its beauties, too. They’re found in the language.