Source: The Conversation (Au and NZ) – By Frances Flanagan, Sydney Fellow, Discipline of Work and Organisational Studies, University of Sydney
After heated criticism from several quarters, the federal government last month announced a meagre rise in the JobSeeker payment for people looking for work. But at just $25 more a week, the payment remains the second-lowest in the OECD.
The government also announced a hotline for employers to “dob in” unemployed people who reject “suitable work”. These reports will contribute evidence that may be used to reduce or stop payments to JobSeeker recipients.
As many have observed, it is a policy with all-too-familiar resonances. A palette of punitive and stigmatising measures has been rolled out in liberal economies worldwide over the past 40 years. These include drug testing, cashless welfare cards, mandatory job training and a variety of “mutual obligation” requirements.
As many academics have argued, these measures have done much to inflict misery, and little to create jobs or foster a genuine sense of mutuality. They have also propped up the myth that individuals are overwhelmingly responsible for their own wagelessness, regardless of wider labour market conditions.
Has it always been this way?
A long history of coercing people into work
The idea that it is desirable to use the law to coerce people into employment relationships they do not wish to be in has a long history around the world.
To understand it, we need to look to the origins of both labour law and welfare law. The English parliament passed the 1351 Statute of Labourers in the wake of another plague – the Black Death. It required any able-bodied person below the age of 60 to accept any offer of work.
The Old Poor Laws, introduced some 250 years later, were a landmark in the early history of welfare provision. They granted parishes the power to offer support to able-bodied people on the condition they could demonstrate they were “deserving”: that is, neither idle, dissolute, lazy, of poor character nor prone to vice. Together, these feudal and early modern laws reflected and encoded the practice of scrutinising the “unwillingness” of the wageless to work.
As the historian John Murphy has argued, lawmakers in the Australian colonies were adamant the Australian “new world” would not reproduce the injustices of the old. The colonies vigorously rejected enacting any equivalent of the Poor Law. However, in the context of a sparsely inhabited continent afflicted by labour shortages, colonial laws concerning work were nevertheless highly coercive.
British master and servant legislation, which granted employers sweeping powers to punish employees for absconding, disloyalty and desertion, backed by criminal sanctions, was adopted with even harsher penalties in Australia than in the UK.
The NSW Servants and Laborers Act of 1828 provided for up to six months’ imprisonment for absenteeism or desertion: double the penalty of the equivalent UK legislation. In the 19th century, employers didn’t “dob in” workers who declined a job. Rather, they dobbed in workers who left without permission.
In the early 20th century the state intervened in labour markets with a very different objective: to enshrine the pre-conditions of “civilisation”. Justice Higgins’ famous Harvester Judgment required minimum wages calculated on the basis of “the needs of the average employee, regarded as a human being living in a civilised community”, rather than the “higgling of the market”.
The imperative to break with stigmatising traditions went beyond establishing the legal and economic institutions of a “wage earners’ welfare state”. It was also expressed through campaigns to entrench welfare systems in Australia that did not require people to prove their poverty and good character before they could access collective funds.
Instead, reformers argued welfare should be organised around the principle of contributory insurance. This involved workers, employers and the state compulsorily saving for future needs. Under such a model (which Australia later embraced on a national scale for superannuation) funds could be accessed in times of need without the taint of “charity”.
Enter the Great Depression
These interwar campaigns largely failed; Queensland, under the leadership of “Red Ted” Theodore, was the only state to pass an Unemployed Workers Insurance Act, in 1922. As unemployment soared above 30%, the plausibility of insurance models as a solution to large-scale social need, or as a panacea for welfare stigma, faded, and the spectre of “deservedness” in welfare provision returned.
Depression-era systems enabled Unemployment Relief Councils to offer “sustenance work” to men at rates below the basic wage. They also offered forms of “relief” that hinged on proof recipients had not only applied for work but had moved across the district in pursuit of it. Where the master and servant laws had once functioned to keep workers in place, in the 1930s “track rations” policies kept workers on the move.
The passage of wartime emergency laws added a new dimension to the state’s willingness and powers to intervene in individual employment relationships. These laws were aimed at securing the production of munitions and essential supplies and services.
The Manpower Directorate, Women’s Employment Board and National Security (Employment) regulations were used to scrutinise and curtail the movement of skilled workers between jobs. They were also used to keep “desirable” employees with particular employers and in jobs deemed essential.
These initiatives were introduced against a backdrop of Keynesian economics, which would expand after the war and drive a commitment to full employment as an economic objective. Keynesianism would also engender more generous welfare systems, funded from general revenue, which required the needy only to show they were “capable and willing to undertake suitable work”, with no time limit on benefits and no property means test.
‘Dobbing in’ may not have the desired effect
The notion of enlisting employers to “dob in” people unwilling to enter an employment relationship is likely to be ineffective, given the levels of under-employment in the labour market and the negative response the measure received from some employer groups.
It is likely, too, to deepen social division and increase the potential for exploitation of already vulnerable people. It is an illiberal policy, philosophically at odds with the notion that the parties in labour markets should be free to enter and withdraw from contracts as they see fit.
By placing the policy within the longer context of Australia’s 20th-century history of flawed “bold experiments” in regulating work and welfare, though, we can also see there are alternatives to coercing people into employment relationships.
States can actively intervene in labour markets on a principled basis to promote secure work that meets human need. They can pass policies designed to ameliorate, rather than inflame, ideas of “deservedness”. And they can intervene in pursuit of a wider social “mission”.
It is important not to romanticise these policy initiatives in Australian history. Each of them deepened processes of exclusion for some, even as it opened new possibilities of fair treatment for others. Nevertheless, in an age of rising inequality, social fragmentation and climate change, these experiments are worth remembering, if only as a stimulus to boldly proposing new ones for our times.
– ref. Australia has a long history of coercing people into work. There are better options than ‘dobbing in’ – https://theconversation.com/australia-has-a-long-history-of-coercing-people-into-work-there-are-better-options-than-dobbing-in-156296