Automation and the future of teachers

Discussion in 'General Education' started by consciousteach, Jul 27, 2017.


Can Teachers be automated?

  1. Yes

    16.7%
  2. No

    83.3%
  1. Obadiah

    Obadiah Groupie

    Joined:
    Jul 27, 2015
    Messages:
    1,358
    Likes Received:
    841

    Aug 2, 2017

    Caesar753, I'm very much in agreement. Current automation cannot effectively replace this type of human interaction, and possibly never will, but I've read that current technology can read a student's body language to some degree. If teachers were replaced by automation, any mechanical deficiencies would have to be ignored, which could detrimentally affect student progress. Back to the original thought: currently, the body language that is "read" is "interpreted" according to how the computer is programmed.

    I'm having a hard time summarizing my thoughts without writing a book instead of a post. There are a lot of ifs, ands, or buts to be considered, and I can even find arguments against my current opinions. But I feel a strong need to express what I've seen since I've been a teacher, long before the current era of technology (I began my career with a TI-99/4A in the classroom). I've seen practical and effective uses of technology, but I've also seen an aura of adoration for technology in spite of its deficiencies (and the early uses of computers had many). I've seen something else, too. I've seen various philosophies promoted and "proven" through manipulation of statistics to support various ideas, ideals, and programs. I've also seen poor management of technology, not just in schools but in other areas--how can I describe this? Sometimes the technology is so heavily relied upon that effective human intervention is ignored. These three deficiencies, combined with an automated classroom, could lessen rather than enhance the educational experience, especially when combined with a possible Orwellian-type philosophy guiding the computer programming.

    I've noticed another current trend, not just in the above posts, but in any such discussion. It's difficult to imagine the actual implementation of such a classroom because current technology and research are much, much more expansive than desktops, laptops, and Jetsons-style robots. Examples: computers are now being developed to identify humans by their scent, replacing palm vein readers and fingerprint readers (good! I'm getting tired of wetting my finger in order to clock in). Virtual reality is now more than a fancy pair of glasses. Technical research facilities are being built to construct whatevers, rather than specific instrumentation. An automated classroom will probably make a Rod Serling film look like ancient history.
     
    stask81 likes this.
  2. stask81

    stask81 Rookie

    Joined:
    Jul 27, 2017
    Messages:
    36
    Likes Received:
    14

    Aug 2, 2017

    Could be a good time for you to start writing that book as a thought leader/influencer :), as this is a big and broad topic--partly because no one has a clue yet where school-based automation is headed (maybe the notion of "school" as a center of collective learning, physical or virtual, is itself disrupted).

    From reading about how modern AI works, it seems that what the AI sees or interprets may no longer be according to how it was programmed. Traditional, instructional programming paradigms tell the computer what to do and how to do it, which works well where the outcomes are well-defined, e.g. financial transaction processing, scoring multiple-choice questions, etc.
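    To make the contrast concrete, here's a minimal, hypothetical sketch of the "instruction-based" paradigm (the function name, answer key, and student responses are made up for illustration): the programmer spells out the rule, so the outcome is completely determined by the code.

    ```python
    # "Instruction-based" programming: the rule is written out explicitly,
    # so the program's behavior is fixed by the programmer, not learned.
    def score_multiple_choice(answers, key):
        """Count how many answers match the answer key."""
        return sum(1 for a, k in zip(answers, key) if a == k)

    answer_key = ["B", "C", "A", "D"]   # hypothetical answer key
    student    = ["B", "C", "D", "D"]   # hypothetical student responses
    print(score_multiple_choice(student, answer_key))  # prints 3
    ```

    There is no ambiguity here: given the same inputs, the program will always produce the same score, which is exactly why this style suits well-defined tasks.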

    Modern-day AI programming paradigms are "training-based" rather than "instruction-based", since "instructionism" inevitably results in rigid, human-defined outcomes, which is not good for solving fluid, soft problems like reading body language, recognizing a disguised person, etc. The "training-based" paradigm starts by defining a software model that emulates the human neural network, then runs continuous, massive test sets against that neural model (i.e. using Big Data to train the AI), refining the model's parameters upon failure until the model achieves 80-90% success on its test sets.
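    The "refine parameters upon failure" loop can be sketched with a toy model. This is only an illustration of the training-based idea, not real AI: the model is a single perceptron (about the simplest neural unit there is), the "Big Data" is just four examples of logical AND, and all the numbers are assumptions chosen for the demo.

    ```python
    # "Training-based" programming: instead of coding the rule, we define a
    # tiny model (a perceptron) and nudge its weights whenever it fails on
    # a training example. The rule is learned from data, not written out.
    def train_perceptron(examples, epochs=20, lr=0.1):
        w = [0.0, 0.0]   # model parameters, refined upon failure
        b = 0.0
        for _ in range(epochs):
            for (x1, x2), target in examples:
                pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
                error = target - pred
                if error != 0:            # model failed on this example:
                    w[0] += lr * error * x1   # refine the parameters
                    w[1] += lr * error * x2
                    b += lr * error
        return w, b

    def predict(w, b, x1, x2):
        return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

    # The "training set" here is just the truth table of logical AND.
    data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    w, b = train_perceptron(data)
    print([predict(w, b, x1, x2) for (x1, x2), _ in data])  # prints [0, 0, 0, 1]
    ```

    Nobody ever wrote "return 1 only if both inputs are 1" into the program; the weights ended up encoding that rule after repeated failure-driven refinement, which is the same basic shape (at vastly smaller scale) as training a modern neural network.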

    All the big boys--Netflix, Amazon, Google, Facebook--are hungry for our data, to build up enough Big Data to feed and train their AI. There was a video documenting how Netflix uses its big data on subscriber behavior to refine its recommendation AI model and achieve an 80-90% match between recommended shows and shows added to favorites lists. In the same way, I think that in the near term (or even now), teachers could be, knowingly or unknowingly, participating in programs that help build up the big data to train the AI (it could be in the area of student body language and receptiveness to learning, analyzing a student's potential strengths and weaknesses, or whatever else). Thus, I do believe that AI can be shaped and refined in a way that achieves 80-90% of what we perceive as core human skills, and I can only hope it will be used for a good purpose.

    Absolutely in agreement--such power combined with an Orwellian-type philosophy would be extremely regressive for education and for our society.

     
    Obadiah likes this.
