The purpose of this discussion was to introduce ourselves and talk about what discipline of IT currently interests us the most
Hi, my name is Tovi and I am from New Jersey. I like the ocean. My major is Digital Multimedia Design. Based on the readings this week I have found the most appealing IT discipline to be Software Engineering.
What appeals to me most about software engineering is that it is a human process and there are a lot of possibilities within the discipline.
After researching this a little more I learned that this discipline isn't just about knowing how to code, but that it's creative and constantly changing. The website wearedevelopers.com lists character traits like curiosity, open-mindedness, and being good at prioritizing as ones that benefit this discipline. Being analytical and being adaptable are also strengths of a software engineer.
The website builtin.com lists careers like Front-End Engineer, Back-End Engineer, Full-Stack Engineer, Security Engineer, and DevOps Engineer as possibilities within the software engineering discipline. I like the fact that software engineers build systems based on human needs and also maintain these systems when things go wrong.
It seems that software engineering brings ideas to life (digitally), which to me is a strength within the field of IST. A possible limitation might be that once we are happy with a piece of software it tends to stay the same for a while, even when improvements could be made based on changes in technology and in how people use it day to day.
Sources:
https://builtin.com/learn/careers/software-engineer
https://www.wearedevelopers.com/magazine/characteristics-of-a-software-engineer-strengths-and-traits
Replies
Hi Rehan,
I think it's great that you get to work alongside software engineers while also pursuing the role of DevOps engineer. IST is not my major, but I feel software engineering would be a good fit for me too because the role involves monitoring performance and testing software to make sure it works well enough to use, and I like problem solving and fixing things. I found an interesting link showing the key differences between a software engineer and a DevOps engineer and have included it below.
DevOps Engineer vs. Software Engineer: Key Differences and Similarities
https://pg-p.ctme.caltech.edu/blog/devops/devops-engineer-vs-software-engineer
Hi Kakii,
You have some nice hobbies, and turning coding into a career makes sense. Pursuing societal issues in relation to computer science is a very interesting path. I found an article from MIT that talks about the ethics and social responsibilities of students who are designing, developing, and bringing new technology to the world. I included it below. Cybersecurity would be one possibility from this discipline, as well as various forms of design.
"Bringing the Societal and Ethical Responsibilities of Computing to the Forefront"
https://news.mit.edu/2023/bringing-social-ethical-responsibilities-computing-forefront-0608
The purpose of this discussion was to practice binary code and talk about making binary code readable
Below is my translation. I approached the task by using an online ASCII table to look up each individual character. There are faster ways, such as an online converter where you can input the entire sequence at once and get the converted text. I played around with both and the translation was the same. I didn't really encounter any challenges. This activity reinforced that binary code gives computing a universally understood way to represent information. A short sketch of the conversion follows the list below.
1. Hello
2. This is a binary code
3. I love data
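For anyone curious how those online converters work, here is a minimal Python sketch of the same decoding step. The sample bit string is a stand-in I wrote for illustration, not the actual sequence from the assignment.

# A minimal sketch of the decoding step described above: turning a
# space-separated string of 8-bit binary values into ASCII text.
def binary_to_text(bits: str) -> str:
    """Convert space-separated 8-bit binary values to ASCII text."""
    return "".join(chr(int(byte, 2)) for byte in bits.split())

# Hypothetical sample, not the assignment's sequence.
sample = "01001000 01100101 01101100 01101100 01101111"
print(binary_to_text(sample))  # -> Hello

Each 8-bit group is read as a base-2 integer and mapped to its ASCII character, which is exactly what I was doing by hand with the table.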
Replies
Hi Al,
We got the same translation, as expected. I did not use paper for this, but I did start by using a table first to get a feel for how a computer would translate. I also checked my answers from the table by using a translator, and it all checked out to be the same. I think this exercise shows how universal binary language is and why it is used in computing. On a side note, I also had to edit for a formatting error. It seems that sometimes when a post is submitted there is a glitch and a large white space shows up. Not sure if this is what happened to you. I had to go back in and fix it.
Hi Kyle,
Our translations were consistent. I also began by using an ASCII table and then checked my answers by using a translator, and it all checked out to be the same. Reflecting on the process of translation, I feel this assignment shows how binary code works universally as a form of digital communication.
The purpose of this lesson was to help each other clarify what we don't understand about descriptive, predictive, and prescriptive analytics
1A.
I think both the lesson and the zyBooks did a good job discussing the terms descriptive, predictive, and prescriptive in relation to data analytics. I understand that when you have a set of data and want to understand what it means, you use descriptive analytics; if you want to forecast what might happen based on previous data, you use predictive analytics; and to give recommendations based on the data, you use prescriptive analytics. Generally it was not that unclear.
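To keep the three straight, I wrote a toy Python example with made-up monthly sales numbers. This is my own illustration, not something from the lesson or zyBooks:

from statistics import mean

sales = [100, 110, 125, 130, 145, 160]  # hypothetical monthly sales figures

# Descriptive: summarize what already happened.
print(f"Average monthly sales: {mean(sales):.1f}")

# Predictive: project what might happen next, here with a naive linear trend.
avg_growth = mean(b - a for a, b in zip(sales, sales[1:]))
forecast = sales[-1] + avg_growth
print(f"Naive forecast for next month: {forecast:.1f}")

# Prescriptive: recommend an action based on the prediction.
if forecast > max(sales):
    print("Recommendation: increase inventory ahead of expected demand.")
else:
    print("Recommendation: hold inventory steady.")

Real predictive and prescriptive analytics use far more sophisticated models, but the division of labor between the three is the same.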
1B.
If I had to pick something in the muddy category, it might be how we know that the data being used is accurate and covers a wide enough range to actually make acceptable predictions and prescriptions.
The link below, from Harvard Business School Online, shows six examples of prescriptive analytics, but it also discusses descriptive and predictive analytics. The article also talks about a fourth way to analyze data, diagnostic analytics, where the analysis interprets why the data happened. I think it is always helpful to look at examples and a variety of interpretations to better understand this content from a real-world lens.
What Is Prescriptive Analytics? 6 Examples
https://online.hbs.edu/blog/post/prescriptive-analytics
Replies
Hi Dustin,
Yes, businesses that use prescriptive data analysis typically stick to the guided nature of this form of analytics because prescriptive analytics aims to help businesses make better decisions in the future and cut down on guesswork. The article I am linking below is from IBM and gives some specific examples of how companies use prescriptive data.
Some of the examples given of where organizations may use prescriptive analytics were “customer segmentation, churn prediction, fraud detection, risk assessment, demand forecasting, prescriptive maintenance and personalized recommendations” (IBM). These examples made it a little clearer to me what kind of prescriptive information would be important to a company.
What is Prescriptive Analytics?
https://www.ibm.com/topics/prescriptive-analytics
Hi Regina,
The three types of analytics presented in the lesson do appear to be straightforward, but what I found when doing some outside research was just how each form of analytics plays a part in providing a service to a company or organization.
One interesting thing I found was that most of my searches turned up not only descriptive, predictive, and prescriptive analytics, but also diagnostic analytics, where the analysis looks at the “why” of the data.
The following link has some information about why data analysis is important. I think it’s one thing to understand the definitions, but seeing how and where it is used also helps to understand it as a career path or why it might be important if you are a business owner who might want data to reduce risky decisions or to find areas for growth.
The Importance of Data Analysis: An Overview of Data Analytics
https://www.cdata.com/blog/importance-of-data-analysis
This lesson served to initiate discussion about the intricacies of computer hardware
Component Functions:
Storage Devices
Storage devices, both internal and external, are key to a computing system, whether we are talking about a personal computer or one connected to a server. We are living in a world where software and work/personal files are getting much, much bigger, so the system needs storage that lets the CPU keep running efficiently despite these changes.
We learned that RAM is temporary but allows us to access files more quickly, while the drive stores the data even when the power is turned off. Technology has advanced from the HDD (hard disk drive) to the SSD (solid state drive), so there are no more spinning disks, meaning less heat, less cooling, and less chance of losing the drive to a mechanical failure. Additionally, many CPUs have a cache for even faster speed and performance, where smaller amounts of data are stored but are accessible much, much faster.
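As a rough illustration of that hierarchy, here is a small Python sketch I put together. It is my own example, the timings are machine-dependent, and the operating system may cache the file in RAM, which can narrow the measured gap considerably:

import os
import tempfile
import time

data = os.urandom(50 * 1024 * 1024)  # 50 MB of random bytes held in RAM

start = time.perf_counter()
total = sum(data[::4096])  # touch the in-memory buffer page by page
ram_time = time.perf_counter() - start

with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(data)
    path = f.name

start = time.perf_counter()
with open(path, "rb") as f:
    from_disk = f.read()  # pull the same 50 MB back from the drive
disk_time = time.perf_counter() - start
os.remove(path)

print(f"RAM scan:  {ram_time:.4f} s")
print(f"Disk read: {disk_time:.4f} s")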
One of the things I find important in the advancement of storage technology is that we can now very easily move stored data with devices like a USB flash drive. Although we now have cloud storage and wireless data transfer, from a hardware perspective the USB flash drive has allowed users to easily access and share data, and possibly also back up data, depending on the size of the external drive. This is critical for photographers and graphic designers who are often transferring data files. The external flash drive can also help the main system run more efficiently because not all data needs to be stored on the internal drive.
Trends and Innovations:
With artificial intelligence expanding by the second, there is a huge need to make sure that the new technology works fluidly with computer hardware. The linked article from MIT talks about how some of the newest AI hardware is targeting deep learning and is capable of even faster mathematical computation with much less energy use. It talks about how inorganic materials like phosphosilicate glass (PSG) are being used to create brain-like neurons and synapses. These resistors can run a million times faster than previous devices for this process and do not slow computer hardware down. These nano devices and nano hardware are much different from the basics we learned about this week, but these trends and innovations are just going to continue to grow and develop.
New hardware offers faster computation for artificial intelligence, with much less energy
https://news.mit.edu/2022/analog-deep-learning-ai-computing-0728
Sources:
The Pennsylvania State University. (2024). Lesson 04: Hardware. In IST110: Information, People, and Technology. Course offered at the Pennsylvania State University’s World Campus.
Replies
Hi Rehan,
I find the entire process of how the CPU fetches instructions from RAM and then decodes them fascinating, considering how much information we ask of our computers every day. You did a good job simplifying the steps the CPU takes in a very understandable way. I already knew a little bit about the motherboard and how, like the heart of the computer, it connects and operates all of the hardware components, but it was nice to get a refresher from the lesson.
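Here is a toy sketch of that fetch-decode-execute loop. It is my own, greatly simplified illustration; a real CPU works with binary instructions, registers, and far more complex control logic than Python tuples:

# "Memory" is just a list of (opcode, operand) pairs.
program = [
    ("LOAD", 5),     # put 5 in the accumulator
    ("ADD", 3),      # add 3 to it
    ("PRINT", None), # output the result
    ("HALT", None),  # stop the machine
]

acc = 0  # accumulator register
pc = 0   # program counter

while True:
    opcode, operand = program[pc]  # fetch the next instruction
    pc += 1
    if opcode == "LOAD":           # decode and execute
        acc = operand
    elif opcode == "ADD":
        acc += operand
    elif opcode == "PRINT":
        print(acc)                 # -> 8
    elif opcode == "HALT":
        break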
The following link talks about the future of motherboards and the different trends in this particular area of hardware design and manufacturing. Some of the design targets seem to be speed, quality of materials, whether the materials are green, and energy-saving technologies. It's interesting to look back at what hardware used to look like, what it looks like now, and what the future of the components might be.
The Future of Motherboards: Emerging Technologies and Trends
https://dtdemos.com/20240415/the-future-of-motherboards-emerging-technologies-and-trends/
Hi John,
You make a lot of good points about the timeline of hardware evolution; it becomes really obvious when we compare the old computers that took up entire rooms to the computers we can now carry in a backpack.
It’s also interesting to look at old gaming consoles compared to today and see the advancing technology of the hardware components.
I found an article that is well researched and clearly displays the evolution of video game consoles, then ends with some information on what the future might hold. It doesn't seem as though the consoles themselves are that much smaller; it's just that the technology is much more advanced. But when we look at VR technology, the corresponding hardware needs incredible processing advancements in order to keep up with the changing software.
The evolution of video game consoles: A journey through generations
https://onesaharan.com/resources/evolution-of-video-game-consoles/
This discussion asked us to write a break-up or love letter to one computer software that we have interacted with
Dear Apple Weather app,
Thank you for keeping me informed each day with your mostly accurate weather forecasts. You always fill my days with fun and I look forward to seeing you. I appreciate that when it is raining you create clear and understandable graphics that show light raindrops or heavy ones with clouds. This might seem simple, but you almost always make so much sense. You are pretty good at what you do, not perfect, but no one is. When I want to know more about what the weather will look like for the day or even the week, with the touch of a button you show me.
You are my favorite application because I like how you not only make it easy for me to see what the weather is like where I live, but also in every state or even around the world. This way when I travel I know what to bring and what to expect.
Finally, thank you for being so straightforward. When things get serious in nature and I need to know what to expect, you send me alerts and tell me how to prepare and what not to do, so for that I thank you.
Yours Truly,
Tovi
Replies
Hi Kyle,
I don't use Windows, but it seems like Windows 11 was a little bit of a disaster. It is frustrating when an operating system is lagging and needs constant updates. It is interesting that sometimes we look forward to a new version of something and it turns out to be a let down. Having to do system restarts is annoying. I hope that you and Windows 11 can at least remain friends.
Hi Kunyang,
Sounds like Microsoft Word needs some therapy. I use Apple Pages but always have to convert my files to either PDF or Microsoft Word, though I don't actually use Word as the writing application. In Pages, I don't have the problems that Microsoft Word seems to give you. There are no pop-ups, not many updates, and no other surprises. The only somewhat annoying thing is that I have to change the file type when submitting most papers because Word documents seem to be what most people use, which surprises me after seeing your letter and how many problems the application brings. I hope you find a better application. If you happen to have an Apple product, Pages works well.
The purpose of this lesson was to have a conversation about the roles and characteristics of machine, assembly, and high-level programming languages
Evolution and Trends:
When we look historically at the evolution of programming languages, we see how machine language used binary codes to work directly with the hardware, so programmers also needed strong knowledge of the hardware.
Assembly language was an upgrade because symbolic instructions were used rather than raw binary. But this language was still tied to specific hardware, which kept it difficult to use.
Fortran followed and allowed programmers to write code that was translated by a compiler, which really made programming more user friendly. Next came higher-level languages, with advancements like the C programming language, which was better suited for writing software and operating systems. The timeline then moves to object-oriented programming and on to languages such as Python and Java, both of which really advanced the way websites are now developed.
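One way to see this layering for yourself is Python's built-in dis module, which prints the lower-level, assembly-like instructions that the Python compiler generates from a high-level statement. This is a small sketch of my own, not from the lesson:

import dis

def add(a, b):
    return a + b

dis.dis(add)
# Prints instructions like LOAD_FAST a, LOAD_FAST b, BINARY_ADD
# (BINARY_OP on newer Python versions), and RETURN_VALUE: a readable,
# assembly-like view of what the interpreter actually executes.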
Each new language has advanced both software and hardware, and if the future continues this trend, hardware and software should keep pushing each other forward. It will be interesting to see these advancements unfold. Artificial intelligence will likely play a role in reshaping this trend and should be something to consider as programming continues to change and evolve.
Source:
https://ipython.ai/evolution-programming-languages-history/
Learning Curve:
C++ might be the hardest programming language to learn. The learning curve is often steeper because C++ requires manual memory management, which demands a high level of attention to detail, and its syntax is also very complex. It's just a language that might be better suited to someone with more computer knowledge. But it is something anyone with patience could learn; it just might not be the easiest.
Source:
https://www.therightsphere.com/blog/is-c-hard-to-learn/
When I searched for learning resources I came up with a variety of websites, many of which are free and offer introductions to most kinds of programming. The following website lists and describes ten that are pretty well known.
https://www.freecodecamp.org/news/learn-to-code-in-2021-10-free-websites-for-learning-coding/
Replies
Hi Khwaja,
Your explanation of the fundamental differences between machine languages, assembly languages, and high-level languages was clear and concise. When the lesson started, it seemed like working closely with the hardware would be more practical, since there might be less room for error in translating the code into binary. But that does not seem to be the case; high-level language is speedy, efficient, and generally more useful. I would agree that machine language still has some legitimacy in systems that need to work more closely with the hardware interface.
In addition to the zyBooks lesson, I found an article that breaks down the timeline of programming languages with explanations that I think are also helpful.
Hi Morgan,
Your explanation of the role of each type of language in modern software development was very clear. It's interesting to look at each, almost side by side, and see what the advantages and disadvantages are. While machine language uses binary and is more difficult to understand, it still has its advantages. Since that is the type of programming language I was least familiar with, I researched it a little further. The link below talks about what assembly language is still good for: it allows more control over a system, and the user gains a better understanding of the processor and memory functions. As we learned, there is a direct link to the hardware, and it is basically transparent. Higher-level languages might be easier to use, but the lower-level language is straightforward once it is learned.
https://www.techopedia.com/why-is-learning-assembly-language-still-important/7/32268