ChatGPT - artificial intelligence.

UKworkshop.co.uk

During uni open days a few years back, I took my son along to two universities to look at Computer Science courses. To say the lecturers were full of dung is an understatement - they were more interested in the useless rubbish they could write research papers on (and get back-slapped for by their peers) than in anything of use to man or beast.

I even asked one student (who'd done two of the three years) what the difference was between POP3 and SMTP, and got a blank look.

I came to the realisation that what unis teach on Computer Science courses bears little resemblance to the commercial world. As for "obsolete in 5 years" - that particular lecturer is chatting the proverbial. Technologies like .NET and React.js are well over five years old and still very mainstream, and design patterns like MVC are as current as it gets, even though they first came to light well over ten years ago.
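For anyone wondering about that question: SMTP is the protocol for sending mail towards a server, POP3 the protocol for pulling it back out of a mailbox. A rough sketch using Python's standard library - the hostnames, addresses and credentials below are invented placeholders, and the functions are never actually called against a real server:

```python
import poplib
import smtplib
from email.message import EmailMessage

# SMTP (Simple Mail Transfer Protocol) is for *sending*: a client, or
# another server, pushes a message towards the recipient's mail server.
msg = EmailMessage()
msg["From"] = "alice@example.org"     # placeholder addresses
msg["To"] = "bob@example.com"
msg["Subject"] = "Hello"
msg.set_content("SMTP delivers this; POP3 later retrieves it.")

def send(message, host="smtp.example.com"):    # hypothetical server
    with smtplib.SMTP(host) as smtp:           # SMTP, typically port 25/587
        smtp.send_message(message)

# POP3 (Post Office Protocol) is for *retrieval*: the recipient's
# client pulls messages down from the mailbox on their server.
def fetch_all(host="pop.example.com", user="bob", password="secret"):
    box = poplib.POP3(host)                    # POP3, typically port 110
    box.user(user)
    box.pass_(password)
    count = len(box.list()[1])
    return [box.retr(i + 1) for i in range(count)]
```

Two different jobs, two different protocols - which is why a blank look after two years of the course is a bit worrying.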

I told my lad to skip Computer Science and study Engineering.

Worked with a few graduates, and uni really does nothing to prepare people for the reality of the workplace, but it is important to focus on the fact that it's called computer science. Plenty of interesting stuff in there, don't get me wrong, but like you say, understanding the theory behind entropy is great; implementing systems or writing commercially viable code is just not in the syllabus. Hell, the last lot I dealt with had no idea how to write tests or use git...

More useful to look at software-focused courses if they exist, or better still an apprenticeship combined with study - a much better way to learn, and far less debt.
 
I find this stuff a bit crazy. The AI art app is really terrifyingly good.
My initial thought is that by using it we are effectively training it to understand us, and thus to control or kill us much more effectively.
We're also giving its developers cycles and cycles of free development. I have read a lot of sci-fi books, but that doesn't mean it's not scary.

Ollie
 
This is all fascinating stuff. Have you seen the website for “Replika”, where you have a virtual friend? I listened to a radio programme where they were speaking to the developer about it - I think an American university - and after testing it on campus, the developers faced a barrage of opposition when they said they were going to end the trial: people saying they had made “real friends” and confidants, some romantically engaged with their “AI friend”, and the thing had taken on a life of its own. I don't pretend to really know how it works, but I believe it's the ability to link together information and actually “learn” that's the big leap.
Steve.
 
More useful to look at software-focused courses if they exist, or better still an apprenticeship combined with study - a much better way to learn, and far less debt.
Software-focused courses - certainly the ones from private providers: the "trainers" are just that, and if you ask a question "not on their list" you might as well be staring at a wall, i.e. no real answer. The common theme is that you don't actually get practitioners doing the training - they do exist, but most L&D departments would wet the bed if you rocked up asking to go on one of their courses (the cost).

Apprenticeships - I can't speak on these except for the ones I witnessed. It was a "data" apprenticeship at one of the large supermarket chains in the UK. The person spent most of their time on their phone - Snapchat, WhatsApp etc.

[I have better visual acuity than the average 6/6 - not tons better but somewhat better - so, from a bank of desks away, I could tell you what app was open on their phone, but not read the actual text.]

Or if they weren't on the phone - the person was filing their nails.

For apprenticeships to work, I think you'd need at least two things: a culture that encourages real learning and accepts that it is a journey, not a destination; and secondly, actual talent to learn from, not twits.

The last bit applies equally to coming into somewhere with a degree.
 
Computing and code left me behind at the ZX Spectrum so I have no claim to any knowledge of how all this works.
I do recall hearing, some years ago, a discussion about the difference between AI and human intelligence. The gist of it was that humans use 'top-down' processing - if we see an orange, we know it's an orange - whereas AI will assess the object based on a string of physical characteristics to decide what it is.
Obviously top-down processing is not infallible: a wax or plastic fruit may be sufficiently realistic that we are fooled, but once we pick it up, the weight, texture, smell and so on will tell us we are wrong. If AI has access to devices that replicate our combined senses, not to mention the host of other detectors we have developed to enhance those rather limited senses, then I wonder how long it will take before it decides the 'wetware' isn't worth hanging on to?

Sorry for such a Monday morning ramble, but I am interested to hear the thoughts of those who are more familiar with the subject
 
Computing and code left me behind at the ZX Spectrum so I have no claim to any knowledge of how all this works.
I do recall hearing, some years ago, a discussion about the difference between AI and human intelligence. The gist of it was that humans use 'top-down' processing - if we see an orange, we know it's an orange - whereas AI will assess the object based on a string of physical characteristics to decide what it is.
Obviously top-down processing is not infallible: a wax or plastic fruit may be sufficiently realistic that we are fooled, but once we pick it up, the weight, texture, smell and so on will tell us we are wrong. If AI has access to devices that replicate our combined senses, not to mention the host of other detectors we have developed to enhance those rather limited senses, then I wonder how long it will take before it decides the 'wetware' isn't worth hanging on to?

Sorry for such a Monday morning ramble, but I am interested to hear the thoughts of those who are more familiar with the subject
"Top down" sounds like matching against a pattern - like facial recognition software, which machines already know how to do.
Machines will "need" us until a point is reached whereby they can completely manage and oversee their own manufacture, i.e. reproduce. At which point evolution could kick in, as long as random variations are survivable.
 
Software-focused courses - certainly the ones from private providers: the "trainers" are just that, and if you ask a question "not on their list" you might as well be staring at a wall, i.e. no real answer. The common theme is that you don't actually get practitioners doing the training
When I was doing my HND in electronics, the lecturers were mostly OK, but come the major project it was a lecturer with years of real-world experience who taught me real-world design and methods, and his approach to teaching software was completely different from the academics' - but it just worked. Some lecturers knew their stuff but weren't able to get it across in a way that could be used in a real-world application, and I think that has always been a problem in subjects where the technology moves so fast that they can only teach what they know.
 
Had to look that up. Not into Zen and the art of, but I'll look at comments on Lila. So much to read and so little time!
Reading other people's "comments" is not a good idea at that level, since you might be (intellectually) accepting the overlay they are casting on the original work.
 
When I was doing my HND in electronics, the lecturers were mostly OK, but come the major project it was a lecturer with years of real-world experience who taught me real-world design and methods, and his approach to teaching software was completely different from the academics' - but it just worked. Some lecturers knew their stuff but weren't able to get it across in a way that could be used in a real-world application, and I think that has always been a problem in subjects where the technology moves so fast that they can only teach what they know.
In all fairness, there is no godly reason for the rubbish level of Computer Science teaching at unis.

Databases: Oracle, MySQL & SQL Server have been around for many decades
Networking: IPv4 is still in use and isn't going anywhere, nor is the networking kit (routers, switches, etc.) and all the things that hang off it, like DNS.
Email: SMTP is still the delivery mechanism, and POP3/IMAP the mailbox retrieval methods.
Languages: the things written in .NET, Java, React.js & even PHP (using MVC patterns) are well into their 2nd decade, if not their 3rd.
The number of teaching hours spent on the above in any Computer Science BSc is bordering on zero. So the real question is - what useless rubbish are they teaching them?
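To be fair on the databases point, the core SQL those systems share really hasn't moved much in decades. A quick illustration using Python's built-in sqlite3 (the table and data are invented for the example) - the same SELECT/GROUP BY shape works, near enough, on Oracle, MySQL and SQL Server too:

```python
import sqlite3

# Throwaway in-memory database for the example.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE staff (name TEXT, dept TEXT)")
con.executemany("INSERT INTO staff VALUES (?, ?)",
                [("Ann", "IT"), ("Bob", "IT"), ("Cat", "HR")])

# Decades-old SQL, still the bread and butter of commercial work.
rows = con.execute(
    "SELECT dept, COUNT(*) FROM staff GROUP BY dept ORDER BY dept"
).fetchall()
print(rows)   # [('HR', 1), ('IT', 2)]
```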
 
In all fairness, there is no godly reason for the rubbish level of Computer Science teaching at unis.

Databases: Oracle, MySQL & SQL Server have been around for many decades
Networking: IPv4 is still in use and isn't going anywhere, nor is the networking kit (routers, switches, etc.) and all the things that hang off it, like DNS.
Email: SMTP is still the delivery mechanism, and POP3/IMAP the mailbox retrieval methods.
Languages: the things written in .NET, Java, React.js & even PHP (using MVC patterns) are well into their 2nd decade, if not their 3rd.
The number of teaching hours spent on the above in any Computer Science BSc is bordering on zero. So the real question is - what useless rubbish are they teaching them?
Whilst database technology such as SQL-based systems has been around since the early 1980s, database technology has moved on as well. The relational database is still pre-eminent, but other NoSQL-type databases exist too, as not everything fits into a relational structure, no matter how big a hammer we hit it with. Very large, event-driven data tends to bog down in tables, and other patterns for data storage and manipulation work well. The skill now is working out what the right tool is. Many organisations are struggling with this, as they have invested tens of millions in SQL-based systems. The organisation I work for has invested hundreds and possibly thousands of millions.
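A toy contrast of the two shapes (the event and its fields are invented): a relational row is a fixed set of columns, while a document store keeps each event as its own nested structure - exactly the kind of data that bogs down in tables:

```python
import json

# Relational shape: fixed columns, every row identical in structure.
row = ("evt-1001", "2023-01-05T10:00:00Z", "page_view")

# Document shape: nested, variable fields per event - awkward to
# flatten into table columns, natural in a document/NoSQL store.
doc = {
    "id": "evt-1001",
    "ts": "2023-01-05T10:00:00Z",
    "type": "page_view",
    "context": {"browser": "firefox", "tags": ["promo", "mobile"]},
}
stored = json.dumps(doc)          # roughly how a document store keeps it
loaded = json.loads(stored)
print(loaded["context"]["tags"])  # ['promo', 'mobile']
```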

IPv4 hasn't yet gone, though we do have to get rid of it ASAP. The model will become IPv4 in the home and small organisations, but we have to move to IPv6 as there is simply no choice: we have run out of IPv4 addresses. Therefore large orgs will need to migrate their external connectivity to IPv6 and handle the exchange (not sodding MS Exchange) in the edge routers. There are a hell of a lot more things hanging off IP than DNS, though. Look through the List of TCP and UDP port numbers - Wikipedia for an interesting list of ports. Yes, I was a network architect doing this sort of stuff.
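The arithmetic behind "we have run out" is easy to check with Python's standard ipaddress module - IPv4 addresses are 32-bit, IPv6 addresses are 128-bit:

```python
import ipaddress

# The whole IPv4 space: 32-bit addresses, about 4.3 billion in total -
# fewer than one per person on the planet, hence the exhaustion.
v4 = ipaddress.ip_network("0.0.0.0/0")
assert v4.num_addresses == 2 ** 32

# IPv6: 128-bit addresses, 2**128 of them.
v6 = ipaddress.ip_network("::/0")
assert v6.num_addresses == 2 ** 128

# The stopgap that kept IPv4 alive: private ranges (RFC 1918) shared
# behind one public address via NAT at the network edge.
assert ipaddress.ip_address("192.168.1.10").is_private

print(v4.num_addresses, v6.num_addresses)
```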

SMTP (not SMPT), Simple Mail Transfer Protocol, is still around and I see nothing replacing it in the near or long term. The computer science grads (I was one) should be looking at the replacements for this. No doubt it will be an SMTP v2 that is backwardly compatible, to preserve the hundreds of billions spent on mail servers. There's loads of research going on around this area, but the millions of servers that run Postfix or Sendmail will act as a millstone for the next 20 years.
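Part of why SMTP refuses to die is its sheer simplicity: a delivery is just a plain-text dialogue between client and server, essentially unchanged since the early RFCs. A sketch of a minimal session (the hosts and addresses are invented):

```python
# C: lines are what the client sends, S: lines the server's replies.
session = """\
S: 220 mail.example.com ESMTP ready
C: HELO client.example.org
S: 250 Hello client.example.org
C: MAIL FROM:<alice@example.org>
S: 250 OK
C: RCPT TO:<bob@example.com>
S: 250 OK
C: DATA
S: 354 End data with <CR><LF>.<CR><LF>
C: Subject: Hello
C: Message body here.
C: .
S: 250 OK: queued
C: QUIT
S: 221 Bye
"""

# Pull out just what the client sends - the whole protocol fits in a
# handful of commands, which is why any v2 would have to stay compatible.
client_lines = [l[3:] for l in session.splitlines() if l.startswith("C: ")]
print(client_lines[0], "...", client_lines[-1])
```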

Language development is very hot; see Rust as an example. .NET is really dead now; we still have systems in place, but they're getting fewer and fewer. Java is still big but its use is (slowly) declining. There are so many issues with the frameworks, versioning and usage (I agree that kids come out of uni with poor development skills). Other issues with Java include garbage collection and memory usage. My final-year thesis was on garbage collection in Lisp, which pretty much nails my colours to the mast :) React.js is really just a framework for JavaScript, which is slowly being replaced by TypeScript. TypeScript is what JavaScript should have been, with strong typing. Microsoft is doing a good job on TypeScript and needs to keep pushing it. PHP, well, it's still around; I've used it since the mid-1990s and it was fine as a hack. I'm not a fan of the system or the error management, and it is also difficult to keep secure unless you really know what you are doing.

We see lots of grads; most appear to be using Python as one of their main languages. Python is OK and is gaining ground in data science. We tend to look on grads and their degrees as people to be moulded into developers rather than as ready to work. The issue with comp science degrees now is that the range of possible studies is 10x what it was when I did mine. We had C, we had Unix, we had Lisp, Pascal and Cobol, and if you were really unlucky you could use Fortran. Graphical interfaces were just popping up, but the study of MMI (Man-Machine Interfaces) was very elementary. On the other hand, we got to build compilers - I've built around 5-6 - and we got to play with hardware and actually build CPUs. These worked in Hz and had discrete logic chips and a two-bit architecture.

Businesses don't want or need people like me for their IT dept; they need people who can keep their MS Exchange server working, keep the website going, and handle the network comms to their storage arrays. Most of this has been abstracted away and on the whole just works. No business needs a new compiler writing, or new IPv4 protocols, but they do need people to configure their network firewalls. A degree student has three years, possibly four, to learn. It's impossible now to cover the wide range of computer subjects in 3-4 years. I'm pretty good at what I do, but I'm struggling to keep up.

I am following the AI work with interest though.

Just my 2p

Rob
 
but we have to move to IPv6 as there is simply no choice: we have run out of IPv4 addresses.
Similar to the move from analogue TV to digital - not entirely for our benefit, but because they were running out of bandwidth in the electromagnetic spectrum and needed to free up space.

Not easy to grasp AI; there seem to be a lot of different strains around, and at what point can it be classed as true AI? I would say it has to be when it works free from any pre-programmed code; it would have to have written its own to replace the original, otherwise it would be like you or me without a brain.
 
Computing and code left me behind at the ZX Spectrum so I have no claim to any knowledge of how all this works.
I do recall hearing, some years ago, a discussion about the difference between AI and human intelligence. The gist of it was that humans use 'top-down' processing - if we see an orange, we know it's an orange - whereas AI will assess the object based on a string of physical characteristics to decide what it is.
Obviously top-down processing is not infallible: a wax or plastic fruit may be sufficiently realistic that we are fooled, but once we pick it up, the weight, texture, smell and so on will tell us we are wrong. If AI has access to devices that replicate our combined senses, not to mention the host of other detectors we have developed to enhance those rather limited senses, then I wonder how long it will take before it decides the 'wetware' isn't worth hanging on to?

Sorry for such a Monday morning ramble, but I am interested to hear the thoughts of those who are more familiar with the subject
I've not heard that distinction before, but it doesn't sound like a real difference between humans and AI. Humans do process lots of physical signals to decide that an object is an orange (otherwise how could we do it?) - we just don't really notice the processing of those signals in our brain, because it happens very quickly and our conscious cognition plays little part in the decision. So it feels like 'we just know', but in truth it's more like 'out of the many possible things the object could be, our brain produces a single hypothesis with a very high prior' - AI does that too. You can test your brain's ability to do this by thinking about confusion in conversation. Humans are great at realising when there's been a misunderstanding and winding back the conversation to fix it, e.g. "oh right, all this time I thought we were talking about wiring but you said writing - no wonder I was confused". For machines this is hard, not because they don't generate those alternative hypotheses about what you said, but because they are engineered to forget them to save compute resource.
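That "single hypothesis with a very high prior" idea can be written down as a toy Bayesian update - all the objects, signals and numbers below are invented purely to illustrate the shape of it:

```python
# Prior beliefs about what a round orange thing tends to be.
priors = {"orange": 0.60, "wax fruit": 0.05, "ball": 0.35}

# How well each hypothesis explains the sensory evidence
# (round, orange-coloured, dimpled skin).
likelihood = {"orange": 0.9, "wax fruit": 0.8, "ball": 0.1}

# Bayes: posterior is proportional to prior times likelihood.
unnorm = {h: priors[h] * likelihood[h] for h in priors}
total = sum(unnorm.values())
posterior = {h: p / total for h, p in unnorm.items()}

# The brain (and the model) keeps the winner; discarding the
# runners-up is the "engineered to forget" part.
best = max(posterior, key=posterior.get)
print(best)   # orange
```

Picking up the fake fruit corresponds to new evidence that drops the 'orange' likelihood, so a re-run of the same sum lets 'wax fruit' win instead.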
 
Humans do process lots of physical signals to decide that an object is an orange (otherwise how could we do it?) -
It is all to do with shape recognition; think about reading. We look at the page and interpret the shapes to form words, and these are pre-stored in our memory from when we learnt to read. We can connect the words to form meaningful sentences so we can understand. AI can easily have a camera that would do the same and end up with a word, but no understanding. That's not a problem if you are just comparing the word to a table of words where each results in some action, as in a digital system, but that is not AI; to be AI it would need to make the decision based on knowledge, which is a very long way off. AI would need to read and understand in order to learn, which I think is still very sci-fi at the moment.
 
Worked with a few graduates, and uni really does nothing to prepare people for the reality of the workplace, but it is important to focus on the fact that it's called computer science. Plenty of interesting stuff in there, don't get me wrong, but like you say, understanding the theory behind entropy is great; implementing systems or writing commercially viable code is just not in the syllabus. Hell, the last lot I dealt with had no idea how to write tests or use git...

More useful to look at software-focused courses if they exist, or better still an apprenticeship combined with study - a much better way to learn, and far less debt.
One thing I find interesting about machine learning is how it is refocusing computer science on the science bit. Of course we need coders, mathematicians and engineers to make AI happen, but we also need people who understand experimental method and rigour, and how to design studies that tease out functional causality from systems of unknown structure. That's subtly different from normal testing, and fascinating for me to observe, as I was trained as a lab scientist and am currently working in AI.
 
Considering the end product that "teachers" are apparently content to be churning out these days, one might think that the questions should have been along the lines of how to do their jobs better.
It's interesting, the level of responsibility put on teachers these days. It's pretty hard to produce decent outcomes when you have kids and parents who have no desire to engage with school and often actively disrupt others. Perhaps if parents put in the effort, the kids might follow. It seems schools are expected to raise people's kids, but without any level of authority over them.

My other half, taking our son to school the other day, saw a parent smoking a spliff on the way to school with their kid trailing behind. What hope does that kid have? My other half spoke to the school, and they said they could do nothing about it; the problem is so bad with some kids that they have to leave their book bags outside because they stink so much. These are 5/6-year-olds - likely to be a little bit baked and unlikely to have had a proper breakfast - and you expect them to learn, and it's the teachers' fault if they don't?

I live in a rural area and it's a pretty good school, so can't imagine how much worse a bad school is.

Maybe a lot of people should ask ChatGPT how to be a better parent
 
