Monday, May 17, 2021

reify (Michael Schwalbe)

1.  Michael Schwalbe, 'The sociologically examined life' (reify)      [ ]
 “reify”
reify [< L. res, thing (see REAL) + FY] to treat (an abstraction) as substantially existing, or as a concrete material object--reification n.

Alfred Korzybski's work maintained that human beings are limited in what they know by
     (1) the structure of their nervous systems, and
     (2) the structure of their languages.

[pp.21-23]
It is not easy to become and remain mindful of the social world as humanly made.  For many reasons the social world seems to be "just there," as if no one were responsible for making it.  So what?  What difference does it make if we forget that the social world is a human invention?  The difference it makes is like that between using one's tools with an awareness of what they are good for and letting those tools--as if they had minds and wills of their own--take charge.
    The failure to see the world as humanly made is called reification, which can also be defined as the tendency to see the humanly made world as having a will and force of its own, apart from human beings.  For example, someone might say, “Computer technology is the major force behind changes in our economy today.”  In this statement, computer technology is reified because it is spoken of as having a will of its own, independent of human beings.  It is technology that appears to make things happen.
    "Computer technology," however, is only metal and plastic.  People take these materials, turn them into computers and other devices, and then decide how to put such tools to work.  All along the way there are people who choose what to build and how to use the results.  But if we talk about technology as if it were a force in its own right, the people who do the building ([designing, lobbying, consulting, planning, executing, interacting, influencing, 'controlling the access']) and choosing disappear.  It thus seems as if technology is like gravity or the wind--a natural force about which we can do nothing.
    Reification keeps us from seeing that the force attributed to technology comes from PEOPLE choosing to do things together in certain ways.  If we don't see this, we may forget to ask important questions, such as, Who is choosing to build what kinds of devices?  Why?  How will our society be changed?  Who stands to benefit and who stands to lose because of these changes?  Should we avoid these changes?  Who will be held accountable if these changes hurt people?  Should we decide to use technology in some other ways?
    Here is another example of reification: “The market responded with enthusiasm to today's rise in interest rates, although economists predict that this could have unfavorable consequences for employment.”  You've probably heard this kind of statement before.  It sounds like a report about a flood or some other natural disaster.  Yet a market is just a lot of people doing things together in a certain way; interest rates are established by people; and employment results from choices by employers.  Reification makes these people and their choices disappear.
    In a large complex society the tendency to reify is strong because it can be hard to see where, how, and by whom decisions are made.  And so it is easier to say that technology, the market or a mysterious THEY is making things happen.  Even people who ought to know better get caught up in this.  When sociologists say things like “Trends in inner-city industrial development are causing changes in family structure,” they too are guilty of reification.  Such language again makes it seem as if no one is responsible for choosing to act in a way that hurts or helps others.
    Reification thus keeps us from seeing who is doing what to whom, and how, such that certain consequences arise.  This makes it hard to hold anyone accountable for the good or bad results arising from their actions.  Usually it is powerful people whose actions are hidden and who get off the hook.
    Reification can also make us feel powerless because the social world comes to seem like a place that is beyond human control.  If we attribute independent force to abstractions such as "technology," "the market," "government," "trends," "social structure," or "society," then it can seem pointless even to try to intervene and make things happen differently.  We might as well try to stop the tides.  People who think this way are likely to remain passive even when they see others being put out of work, living in poverty, or caught up in war, because they will feel that nothing can be done.
    When we reify the social world we are confusing its reality with that of stars and trees and bacteria.  These things indeed exist (as material entities) independent of human ideas and action.  But no part of the social world does.  To reify is to forget this; it is to forget to be mindful of the social world as a humanly made place.  As a result, we forget that it is within our collective power to re-create the world in a better way.  If we are sociologically mindful, we recognize that the social world as it now exists is just one of many possibilities.

  (Schwalbe, Michael, 1956- . The sociologically examined life: pieces of the conversation. 4th ed. Copyright © 2008, 2005, 2001, 1998. 1. Sociology--Methodology. 2. Sociology--Philosophy. pp.21-23)
 <---------------------------------------------------------------------------->

1:25:12
College Lecture Series - Neil Postman - "The Surrender of Culture to Technology"
https://youtu.be/hlrv7DIHllE?t=173
https://www.youtube.com/watch?v=hlrv7DIHllE
College of DuPage
Published on Jun 3, 2013
A lecture delivered by Neil Postman on Mar. 11, 1997 in the Arts Center. Based on the author's book of the same title. Neil Postman notes the dependence of Americans on technological advances for their own security. Americans have come to expect technological innovations to solve the larger problems of mankind. Technology itself has become a national "religion" which people take on faith as the solution to their problems.
 
7 questions
 1. what is the problem to which this technology is a solution?
 2. whose problem is it?
 3. suppose we solve this problem, and solve it decisively, what new problems might be created because we have solved the problem?
 4. which people and what institutions might be most seriously harmed by a technological solution?
 5. what changes in language are being enforced by new technologies?
    what is being gained and what is being lost by such changes?
 6. what sort of people and institutions acquire special economic and political power because of technological change?
    this question needs to be asked, because the transformation of a technology into a medium always results in a realignment of economic and political power.
 7. what alternative uses might be made of a technology?  one proceeds here by assuming that the medium we have created is not necessarily the only one we might make of a particular technology

 https://youtu.be/hlrv7DIHllE?t=1035
 1. what is the problem to which this technology is a solution?
    now this question needs to be asked, because there are technologies that are not solutions to any problem that a normal person would regard as significant

 https://youtu.be/hlrv7DIHllE?t=1440
 2. whose problem is it?
    but this question, whose problem is it, needs to be applied to any technology.  most technologies do solve some problem, but the problem may not be everybody's problem or even most people's problem.  we need to be very careful in determining who will benefit from a technology, and who will pay for it.  they are not always the same people.

 https://youtu.be/hlrv7DIHllE?t=1521
 3. suppose we solve this problem, and solve it decisively, what new problems might be created because we have solved the problem?
    the automobile solves some very important problems for most people

 https://youtu.be/hlrv7DIHllE?t=1740
 4. which people and what institutions might be most seriously harmed by a technological solution?
 
 https://youtu.be/hlrv7DIHllE?t=2259
 5. what changes in language are being enforced by new technologies?
    what is being gained and what is being lost by such changes?

 https://youtu.be/hlrv7DIHllE?t=2746
 6. what sort of people and institutions acquire special economic and political power because of technological change?
    this question needs to be asked, because the transformation of a technology into a medium always results in a realignment of economic and political power.
 
 https://youtu.be/hlrv7DIHllE?t=2925
 7. what alternative uses might be made of a technology?  one proceeds here by assuming that the medium we have created is not necessarily the only one we might make of a particular technology

 https://youtu.be/hlrv7DIHllE?t=3037
 1. what is the problem to which a technology claims to be the solution
 2. whose problem is it
 3. what new problems will be created because of solving an old one
 4. which people and institutions will be most harmed
 5. what changes in language are being promoted
 6. what shifts in economic and political power are likely to result
 7. what alternative media might be made from a technology

automobile, television, computer
the same blindness, no one is asking anything worth asking 
https://youtu.be/hlrv7DIHllE?t=3629
 60:29   Tocqueville says in Democracy in America
 <-------------------------------------------------------------------------->

Kelly, Kevin, 1952—
What technology wants / Kevin Kelly,
1. technology—social aspects.
2. technology and civilization.

T14.5.K45 2010
303.48'3—dc22

copyright © 2010

https://drive.google.com/open?id=1dDEBRwp3XyIKgu_bUyp18nf_9OOmpyyJ

p.194
Often we will invent a machine for a particular and limited purpose, and then, in what Neil Postman calls the Frankenstein syndrome, the invention's own agenda blossoms.  "Once the machine is built," Postman writes, "we discover--always to our surprise--that it has ideas of its own; that it is quite capable not only of changing our habits but ... of changing our habits of mind."  In this way, humans have become an adjunct to or, in Karl Marx's phrase, appendages of the machine.
 
Often we will invent a [system] for a particular and limited purpose, and then, in what Neil Postman calls the Frankenstein syndrome, the invention's own agenda blossoms.  "Once the [system] is built," Postman writes, "we discover--always to our surprise--that it has ideas of its own; that it is quite capable not only of changing our habits but ... of changing our habits of mind."  In this way, humans have become an adjunct to or, in Karl Marx's phrase, appendages of the [system].

In this way, humans have become an adjunct to or, in Karl Marx's phrase, appendages of the [system].

In this way, humans have become an adjunct to or, in Karl Marx's phrase, appendages of the [cybernetic system].

In this way, humans have become an adjunct to or, in Karl Marx's phrase, appendages of the [governing system].
 <-------------------------------------------------------------------------->

