
Top 100+ Computer Engineering Project Topics [Updated]


Computer engineering projects offer a captivating blend of creativity and technical prowess, allowing enthusiasts to dive into a world where innovation meets functionality. Whether you’re fascinated by hardware design, software development, networking, or artificial intelligence, there’s a wide array of project topics to explore within the realm of computer engineering. In this blog, we’ll delve into some intriguing computer engineering project topics, catering to both beginners and seasoned enthusiasts alike.

What Is A CSE Project?


A CSE project refers to a project within the field of Computer Science and Engineering (CSE). These projects involve the application of computer science principles and engineering techniques to develop software, hardware, or systems that solve real-world problems or advance technology.

CSE projects can range from developing new algorithms and programming languages to designing and building computer hardware, networking systems, software applications, or artificial intelligence systems.

They often require interdisciplinary knowledge and skills in areas such as programming, data structures, algorithms, software engineering, hardware design, networking, and more.

How Do I Start A CSE Project?

Starting a CSE (Computer Science and Engineering) project can be an exciting endeavor, but it requires careful planning and preparation. Here’s a step-by-step guide to help you get started:

  • Define Your Project Scope and Goals:
    • Identify the problem or opportunity you want to address with your project.
    • Clearly define the objectives and outcomes you aim to achieve.
    • Determine the scope of your project, including the technologies, tools, and resources you’ll need.
  • Conduct Research:
    • Research existing solutions and technologies related to your project idea.
    • Identify any gaps or opportunities for innovation in the field.
    • Explore relevant literature, academic papers, online resources, and case studies to gain insights and inspiration.
  • Choose a Project Topic:
    • Based on your research, select a specific topic or area of focus for your project.
    • Take into account your interests, skills, and the resources at your disposal.
    • Make sure the topic you select aligns with your project’s aims and objectives.
  • Develop a Project Plan:
    • Create a detailed plan that lists the tasks to be completed, their deadlines, and the milestones you want to reach along the way.
    • Break the project into smaller, manageable parts, and if you’re working with others, make sure everyone knows what they’re responsible for.
    • Define the deliverables and criteria for success for each phase of the project.
  • Gather Resources:
    • Identify the software, hardware, and other resources you’ll need for your project.
    • Set up development environments, programming tools, and any necessary infrastructure.
    • Consider collaborating with peers, mentors, or experts who can provide guidance and support.
  • Design Your Solution:
    • Develop a conceptual design or architecture for your project.
    • Define the system requirements, data structures, algorithms, and user interfaces.
    • Consider usability, scalability, security, and other factors in your design decisions.
  • Implement Your Project:
    • Start building your project based on the design and specifications you’ve developed.
    • Write code, design user interfaces, implement algorithms, and integrate components as needed.
    • Test your project continuously throughout development to identify and fix issues early (a minimal sketch of this implement-and-test loop follows this list).
  • Iterate and Refine:
    • Iterate on your project based on feedback and testing results.
    • Refine your implementation, make improvements, and address any issues or challenges that arise.
    • Continuously evaluate your progress against your project plan and adjust as necessary.
  • Document Your Work:
    • Keep detailed documentation of your project, including design decisions, code comments, and user manuals.
    • Document any challenges you faced, solutions you implemented, and lessons learned throughout the project.
  • Present Your Project:
    • Prepare a presentation or demo showcasing your project’s features, functionality, and achievements.
    • Communicate your project’s goals, methodology, results, and impact effectively to your audience.
    • Solicit feedback from peers, instructors, or industry professionals to gain insights and improve your project.
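
To make the implement-and-test step concrete, here is a minimal sketch of what a very first iteration might look like. It assumes a Python project; the file, function, and data are hypothetical (a small smoothing routine for an environmental-monitoring project) and serve only to show how each new piece of functionality can be paired with an automated test from the start.

```python
# moving_average.py -- a hypothetical first component of an IoT
# environmental-monitoring project: smooth noisy sensor readings.

def moving_average(readings, window=3):
    """Return the simple moving averages over a sliding window of readings."""
    if window <= 0:
        raise ValueError("window must be positive")
    if len(readings) < window:
        return []
    return [
        sum(readings[i:i + window]) / window
        for i in range(len(readings) - window + 1)
    ]


# test_moving_average.py -- run with `python -m unittest` after every change
# so problems surface early instead of at the end of the project.
import unittest

class MovingAverageTest(unittest.TestCase):
    def test_basic_window(self):
        self.assertEqual(moving_average([2, 4, 6, 8], window=2), [3.0, 5.0, 7.0])

    def test_too_few_readings(self):
        self.assertEqual(moving_average([1], window=3), [])

if __name__ == "__main__":
    unittest.main()
```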

By following these steps and staying organized, focused, and adaptable, you can successfully start and complete a CSE project that not only enhances your skills and knowledge but also makes a meaningful contribution to the field of computer science and engineering.

Top 100+ Computer Engineering Project Topics

  • Design and Implementation of a Simple CPU
  • Development of a Real-time Operating System Kernel
  • Construction of a Digital Signal Processor (DSP)
  • Designing an FPGA-based Video Processing System
  • Building a GPU for Parallel Computing
  • Development of a Low-Power Microcontroller System
  • Designing an Efficient Cache Memory Architecture
  • Construction of a Network-on-Chip (NoC) for Multicore Systems
  • Development of a Hardware-based Encryption Engine
  • Designing a Reconfigurable Computing Platform
  • Building a RISC-V Processor Core
  • Development of a Custom Instruction Set Architecture (ISA)
  • Designing an Energy-Efficient Embedded System
  • Construction of a High-Speed Serial Communication Interface
  • Developing a Real-time Embedded System for Robotics
  • Designing an IoT-based Home Automation System
  • Building a Wearable Health Monitoring Device
  • Development of a Wireless Sensor Network for Environmental Monitoring
  • Designing an Automotive Control System
  • Building a GPS Tracking System for Vehicles
  • Development of a Smart Grid Monitoring System
  • Designing a Digital Audio Processor for Music Synthesis
  • Building a Speech Recognition System
  • Developing a Biometric Authentication System
  • Designing a Facial Recognition Security System
  • Construction of an Autonomous Drone
  • Development of a Gesture Recognition Interface
  • Designing an Augmented Reality Application
  • Building a Virtual Reality Simulator
  • Developing a Haptic Feedback System
  • Designing a Real-time Video Streaming Platform
  • Building a Multimedia Content Delivery Network (CDN)
  • Development of a Scalable Web Server Architecture
  • Designing a Peer-to-Peer File Sharing System
  • Building a Distributed Database Management System
  • Developing a Blockchain-based Voting System
  • Designing a Secure Cryptocurrency Exchange Platform
  • Building an Anonymous Communication Network
  • Development of a Secure Email Encryption System
  • Designing a Network Intrusion Detection System (NIDS)
  • Building a Firewall with Deep Packet Inspection (DPI)
  • Developing a Vulnerability Assessment Tool
  • Designing a Secure Password Manager Application
  • Building a Malware Analysis Sandbox
  • Development of a Phishing Detection System
  • Designing a Chatbot for Customer Support
  • Building a Natural Language Processing (NLP) System
  • Developing an AI-powered Personal Assistant
  • Designing a Recommendation System for E-commerce
  • Building an Intelligent Tutoring System
  • Development of a Sentiment Analysis Tool
  • Designing an Autonomous Vehicle Navigation System
  • Building a Traffic Management System
  • Developing a Smart Parking Solution
  • Designing a Remote Health Monitoring System
  • Building a Telemedicine Platform
  • Development of a Medical Image Processing Application
  • Designing a Drug Discovery System
  • Building a Healthcare Data Analytics Platform
  • Developing a Smart Agriculture Solution
  • Designing a Crop Monitoring System
  • Building an Automated Irrigation System
  • Developing a Food Quality Inspection Tool
  • Designing a Supply Chain Management System
  • Building a Warehouse Automation Solution
  • Developing an Inventory Optimization Tool
  • Designing a Smart Retail Store System
  • Building a Self-checkout System
  • Developing a Customer Behavior Analytics Platform
  • Designing a Fraud Detection System for Banking
  • Building a Risk Management Solution
  • Developing a Personal Finance Management Application
  • Designing a Stock Market Prediction System
  • Building a Portfolio Management Tool
  • Developing a Smart Energy Management System
  • Designing a Home Energy Monitoring Solution
  • Building a Renewable Energy Integration Platform
  • Developing a Smart Grid Demand Response System
  • Designing a Disaster Management System
  • Building an Emergency Response Coordination Tool
  • Developing a Weather Prediction and Monitoring System
  • Designing a Climate Change Mitigation Solution
  • Building a Pollution Monitoring and Control System
  • Developing a Waste Management Optimization Tool
  • Designing a Smart City Infrastructure Management System
  • Building a Traffic Congestion Management Solution
  • Developing a Public Safety and Security Platform
  • Designing a Citizen Engagement and Participation System
  • Building a Smart Transportation Network
  • Developing a Smart Water Management System
  • Designing a Water Quality Monitoring and Control System
  • Building a Flood Detection and Response System
  • Developing a Coastal Erosion Prediction Tool
  • Designing an Air Quality Monitoring and Control System
  • Building a Green Building Energy Optimization Solution
  • Developing a Sustainable Transportation Planning Tool
  • Designing a Wildlife Conservation Monitoring System
  • Building a Biodiversity Mapping and Protection Platform
  • Developing a Natural Disaster Early Warning System
  • Designing a Remote Sensing and GIS Integration Solution
  • Building a Climate Change Adaptation and Resilience Platform

7 Helpful Tips for Your Final Year Engineering Project

Embarking on a final year engineering project can be both exhilarating and daunting. Here are seven helpful tips to guide you through the process and ensure the success of your project:

Start Early and Plan Thoroughly

  • Begin planning your project as soon as possible to allow ample time for research, design, and implementation.
  • Break down your project into smaller tasks and create a detailed timeline with milestones to track your progress.
  • Consider any potential challenges or obstacles you may encounter and plan contingencies accordingly.

Choose the Right Project

  • Select a project that aligns with your interests, skills, and career goals.
  • Ensure that the project is feasible within the time and resource constraints of your final year.
  • Seek advice from professors, mentors, or industry professionals to help you choose a project that is both challenging and achievable.

Conduct Thorough Research

  • Invest time in researching existing solutions, technologies, and literature related to your project idea.
  • Identify gaps or opportunities for innovation that your project can address.
  • Keep track of relevant papers, articles, and resources to inform your design and implementation decisions.

Communicate Effectively

  • Maintain regular communication with your project advisor or supervisor to seek guidance and feedback.
  • Collaborate effectively with teammates, if applicable, by establishing clear channels of communication and dividing tasks appropriately.
  • Practice effective communication skills when presenting your project to classmates, professors, or industry professionals.

Focus on Quality and Innovation

  • Strive for excellence in every aspect of your project, from design and implementation to documentation and presentation.
  • Aim to innovate: look for ways to improve on existing solutions rather than simply replicating them.
  • Pay close attention to detail and hold every part of your work to a high standard.

Test and Iterate

  • Test your project rigorously throughout the development process to identify and address any issues or bugs.
  • Solicit feedback from peers, advisors, or end-users to gain insights and improve your project.
  • Iterate on your design and implementation based on feedback and testing results to refine your solution and enhance its functionality.

Manage Your Time Effectively

  • Prioritize tasks and allocate time wisely to ensure that you meet deadlines and deliverables.
  • Break down larger tasks into smaller, manageable chunks and tackle them one at a time.
  • Stay organized with tools such as calendars, to-do lists, and project management software to track your progress and stay on schedule.

By following these tips and staying focused, disciplined, and proactive, you can navigate the challenges of your final year engineering project with confidence and achieve outstanding results. Remember to stay flexible and adaptable, and don’t hesitate to seek help or advice when needed. Good luck!

Computer engineering project topics offer a unique opportunity to blend creativity with technical expertise, empowering enthusiasts to explore diverse domains of computing while tackling real-world challenges. Whether you’re interested in hardware design, software development, networking, or artificial intelligence, there’s a wealth of project topics to inspire innovation and learning.

By taking on these projects, enthusiasts can sharpen their skills, deepen their knowledge, and contribute to the ever-evolving world of technology. So get ready to work hard, let your imagination flow, and begin an exciting journey of learning and discovery in the remarkable field of computer engineering.



Research Topics & Ideas: CompSci & IT

50+ Computer Science Research Topic Ideas To Fast-Track Your Project


Finding and choosing a strong research topic is the critical first step when it comes to crafting a high-quality dissertation, thesis or research project. If you’ve landed on this post, chances are you’re looking for a computer science-related research topic, but aren’t sure where to start. Here, we’ll explore a variety of CompSci & IT-related research ideas and topic thought-starters, including algorithms, AI, networking, database systems, UX, information security and software engineering.

NB – This is just the start…

The topic ideation and evaluation process has multiple steps. In this post, we’ll kickstart the process by sharing some research topic ideas within the CompSci domain. This is the starting point, but to develop a well-defined research topic, you’ll need to identify a clear and convincing research gap, along with a well-justified plan of action to fill that gap.

If you’re new to the oftentimes perplexing world of research, or if this is your first time undertaking a formal academic research project, be sure to check out our free dissertation mini-course. In it, we cover the process of writing a dissertation or thesis from start to end. Be sure to also sign up for our free webinar that explores how to find a high-quality research topic. 

Overview: CompSci Research Topics

  • Algorithms & data structures
  • Artificial intelligence (AI)
  • Computer networking
  • Database systems
  • Human-computer interaction
  • Information security (IS)
  • Software engineering
  • Examples of CompSci dissertation & theses

Topics/Ideas: Algorithms & Data Structures

  • An analysis of neural network algorithms’ accuracy for processing consumer purchase patterns
  • A systematic review of the impact of graph algorithms on data analysis and discovery in social media network analysis
  • An evaluation of machine learning algorithms used for recommender systems in streaming services
  • A review of approximation algorithm approaches for solving NP-hard problems
  • An analysis of parallel algorithms for high-performance computing of genomic data
  • The influence of data structures on optimal algorithm design and performance in Fintech
  • A survey of algorithms applied in internet of things (IoT) systems in supply-chain management
  • A comparison of streaming algorithm performance for the detection of elephant flows
  • A systematic review and evaluation of machine learning algorithms used in facial pattern recognition
  • Exploring the performance of a decision tree-based approach for optimizing stock purchase decisions
  • Assessing the importance of complete and representative training datasets in agricultural machine-learning-based decision making.
  • A comparison of deep learning algorithms’ performance for structured and unstructured datasets with “rare cases”
  • A systematic review of noise reduction best practices for machine learning algorithms in geoinformatics.
  • Exploring the feasibility of applying information theory to feature extraction in retail datasets.
  • Assessing the use case of neural network algorithms for image analysis in biodiversity assessment

Topics & Ideas: Artificial Intelligence (AI)

  • Applying deep learning algorithms for speech recognition in speech-impaired children
  • A review of the impact of artificial intelligence on decision-making processes in stock valuation
  • An evaluation of reinforcement learning algorithms used in the production of video games
  • An exploration of key developments in natural language processing and how they impacted the evolution of chatbots
  • An analysis of the ethical and social implications of artificial intelligence-based automated marking
  • The influence of large-scale GIS datasets on artificial intelligence and machine learning developments
  • An examination of the use of artificial intelligence in orthopaedic surgery
  • The impact of explainable artificial intelligence (XAI) on transparency and trust in supply chain management
  • An evaluation of the role of artificial intelligence in financial forecasting and risk management in cryptocurrency
  • A meta-analysis of deep learning algorithm performance in predicting and preventing cyber attacks in schools


Topics & Ideas: Networking

  • An analysis of the impact of 5G technology on internet penetration in rural Tanzania
  • Assessing the role of software-defined networking (SDN) in modern cloud-based computing
  • A critical analysis of network security and privacy concerns associated with Industry 4.0 investment in healthcare.
  • Exploring the influence of cloud computing on security risks in fintech.
  • An examination of the use of network function virtualization (NFV) in telecom networks in South America
  • Assessing the impact of edge computing on network architecture and design in IoT-based manufacturing
  • An evaluation of the challenges and opportunities in 6G wireless network adoption
  • The role of network congestion control algorithms in improving network performance on streaming platforms
  • An analysis of network coding-based approaches for data security
  • Assessing the impact of network topology on network performance and reliability in IoT-based workspaces


Topics & Ideas: Database Systems

  • An analysis of big data management systems and technologies used in B2B marketing
  • The impact of NoSQL databases on data management and analysis in smart cities
  • An evaluation of the security and privacy concerns of cloud-based databases in financial organisations
  • Exploring the role of data warehousing and business intelligence in global consultancies
  • An analysis of the use of graph databases for data modelling and analysis in recommendation systems
  • The influence of the Internet of Things (IoT) on database design and management in the retail grocery industry
  • An examination of the challenges and opportunities of distributed databases in supply chain management
  • Assessing the impact of data compression algorithms on database performance and scalability in cloud computing
  • An evaluation of the use of in-memory databases for real-time data processing in patient monitoring
  • Comparing the effects of database tuning and optimization approaches in improving database performance and efficiency in omnichannel retailing

Topics & Ideas: Human-Computer Interaction

  • An analysis of the impact of mobile technology on human-computer interaction prevalence in adolescent men
  • An exploration of how artificial intelligence is changing human-computer interaction patterns in children
  • An evaluation of the usability and accessibility of web-based systems for CRM in the fast fashion retail sector
  • Assessing the influence of virtual and augmented reality on consumer purchasing patterns
  • An examination of the use of gesture-based interfaces in architecture
  • Exploring the impact of ease of use in wearable technology on geriatric users
  • Evaluating the ramifications of gamification in the Metaverse
  • A systematic review of user experience (UX) design advances associated with Augmented Reality
  • Comparing end-user perceptions of natural language processing algorithms for automated customer response
  • Analysing the impact of voice-based interfaces on purchase practices in the fast food industry


Topics & Ideas: Information Security

  • A bibliometric review of current trends in cryptography for secure communication
  • An analysis of secure multi-party computation protocols and their applications in cloud-based computing
  • An investigation of the security of blockchain technology in patient health record tracking
  • A comparative study of symmetric and asymmetric encryption algorithms for instant text messaging
  • A systematic review of secure data storage solutions used for cloud computing in the fintech industry
  • An analysis of intrusion detection and prevention systems used in the healthcare sector
  • Assessing security best practices for IoT devices in political offices
  • An investigation into the role social media played in shifting regulations related to privacy and the protection of personal data
  • A comparative study of digital signature schemes adoption in property transfers
  • An assessment of the security of secure wireless communication systems used in tertiary institutions

Topics & Ideas: Software Engineering

  • A study of agile software development methodologies and their impact on project success in pharmacology
  • Investigating the impacts of software refactoring techniques and tools in blockchain-based developments
  • A study of the impact of DevOps practices on software development and delivery in the healthcare sector
  • An analysis of software architecture patterns and their impact on the maintainability and scalability of cloud-based offerings
  • A study of the impact of artificial intelligence and machine learning on software engineering practices in the education sector
  • An investigation of software testing techniques and methodologies for subscription-based offerings
  • A review of software security practices and techniques for protecting against phishing attacks from social media
  • An analysis of the impact of cloud computing on the rate of software development and deployment in the manufacturing sector
  • Exploring the impact of software development outsourcing on project success in multinational contexts
  • An investigation into the effect of poor software documentation on app success in the retail sector

CompSci & IT Dissertations/Theses

While the ideas we’ve presented above are a decent starting point for finding a CompSci-related research topic, they are fairly generic and non-specific. So, it helps to look at actual dissertations and theses to see how this all comes together.

Below, we’ve included a selection of research projects from various CompSci-related degree programs to help refine your thinking. These are actual dissertations and theses, written as part of Master’s and PhD-level programs, so they can provide some useful insight as to what a research topic looks like in practice.

  • An array-based optimization framework for query processing and data analytics (Chen, 2021)
  • Dynamic Object Partitioning and replication for cooperative cache (Asad, 2021)
  • Embedding constructural documentation in unit tests (Nassif, 2019)
  • PLASA | Programming Language for Synchronous Agents (Kilaru, 2019)
  • Healthcare Data Authentication using Deep Neural Network (Sekar, 2020)
  • Virtual Reality System for Planetary Surface Visualization and Analysis (Quach, 2019)
  • Artificial neural networks to predict share prices on the Johannesburg stock exchange (Pyon, 2021)
  • Predicting household poverty with machine learning methods: the case of Malawi (Chinyama, 2022)
  • Investigating user experience and bias mitigation of the multi-modal retrieval of historical data (Singh, 2021)
  • Detection of HTTPS malware traffic without decryption (Nyathi, 2022)
  • Redefining privacy: case study of smart health applications (Al-Zyoud, 2019)
  • A state-based approach to context modeling and computing (Yue, 2019)
  • A Novel Cooperative Intrusion Detection System for Mobile Ad Hoc Networks (Solomon, 2019)
  • HRSB-Tree for Spatio-Temporal Aggregates over Moving Regions (Paduri, 2019)

Looking at these titles, you can probably pick up that the research topics here are quite specific and narrowly focused, compared to the generic ones presented earlier. This is an important thing to keep in mind as you develop your own research topic. That is to say, to create a top-notch research topic, you must be precise and target a specific context with specific variables of interest. In other words, you need to identify a clear, well-justified research gap.

Fast-Track Your Research Topic

If you’re still feeling a bit unsure about how to find a research topic for your Computer Science dissertation or research project, check out our Topic Kickstarter service.



500+ Computer Science Research Topics


Computer Science is a constantly evolving field that has transformed the world we live in today. With new technologies emerging every day, there are countless research opportunities in this field. Whether you are interested in artificial intelligence, machine learning, cybersecurity, data analytics, or computer networks, there are endless possibilities to explore. In this post, we will delve into some of the most interesting and important research topics in Computer Science. From the latest advancements in programming languages to the development of cutting-edge algorithms, we will explore the latest trends and innovations that are shaping the future of Computer Science. So, whether you are a student or a professional, read on to discover some of the most exciting research topics in this dynamic and rapidly expanding field.

Computer Science Research Topics

Computer Science Research Topics are as follows:

  • Using machine learning to detect and prevent cyber attacks
  • Developing algorithms for optimized resource allocation in cloud computing
  • Investigating the use of blockchain technology for secure and decentralized data storage
  • Developing intelligent chatbots for customer service
  • Investigating the effectiveness of deep learning for natural language processing
  • Developing algorithms for detecting and removing fake news from social media
  • Investigating the impact of social media on mental health
  • Developing algorithms for efficient image and video compression
  • Investigating the use of big data analytics for predictive maintenance in manufacturing
  • Developing algorithms for identifying and mitigating bias in machine learning models
  • Investigating the ethical implications of autonomous vehicles
  • Developing algorithms for detecting and preventing cyberbullying
  • Investigating the use of machine learning for personalized medicine
  • Developing algorithms for efficient and accurate speech recognition
  • Investigating the impact of social media on political polarization
  • Developing algorithms for sentiment analysis in social media data
  • Investigating the use of virtual reality in education
  • Developing algorithms for efficient data encryption and decryption
  • Investigating the impact of technology on workplace productivity
  • Developing algorithms for detecting and mitigating deepfakes
  • Investigating the use of artificial intelligence in financial trading
  • Developing algorithms for efficient database management
  • Investigating the effectiveness of online learning platforms
  • Developing algorithms for efficient and accurate facial recognition
  • Investigating the use of machine learning for predicting weather patterns
  • Developing algorithms for efficient and secure data transfer
  • Investigating the impact of technology on social skills and communication
  • Developing algorithms for efficient and accurate object recognition
  • Investigating the use of machine learning for fraud detection in finance
  • Developing algorithms for efficient and secure authentication systems
  • Investigating the impact of technology on privacy and surveillance
  • Developing algorithms for efficient and accurate handwriting recognition
  • Investigating the use of machine learning for predicting stock prices
  • Developing algorithms for efficient and secure biometric identification
  • Investigating the impact of technology on mental health and well-being
  • Developing algorithms for efficient and accurate language translation
  • Investigating the use of machine learning for personalized advertising
  • Developing algorithms for efficient and secure payment systems
  • Investigating the impact of technology on the job market and automation
  • Developing algorithms for efficient and accurate object tracking
  • Investigating the use of machine learning for predicting disease outbreaks
  • Developing algorithms for efficient and secure access control
  • Investigating the impact of technology on human behavior and decision making
  • Developing algorithms for efficient and accurate sound recognition
  • Investigating the use of machine learning for predicting customer behavior
  • Developing algorithms for efficient and secure data backup and recovery
  • Investigating the impact of technology on education and learning outcomes
  • Developing algorithms for efficient and accurate emotion recognition
  • Investigating the use of machine learning for improving healthcare outcomes
  • Developing algorithms for efficient and secure supply chain management
  • Investigating the impact of technology on cultural and societal norms
  • Developing algorithms for efficient and accurate gesture recognition
  • Investigating the use of machine learning for predicting consumer demand
  • Developing algorithms for efficient and secure cloud storage
  • Investigating the impact of technology on environmental sustainability
  • Developing algorithms for efficient and accurate voice recognition
  • Investigating the use of machine learning for improving transportation systems
  • Developing algorithms for efficient and secure mobile device management
  • Investigating the impact of technology on social inequality and access to resources
  • Machine learning for healthcare diagnosis and treatment
  • Machine Learning for Cybersecurity
  • Machine learning for personalized medicine
  • Cybersecurity threats and defense strategies
  • Big data analytics for business intelligence
  • Blockchain technology and its applications
  • Human-computer interaction in virtual reality environments
  • Artificial intelligence for autonomous vehicles
  • Natural language processing for chatbots
  • Cloud computing and its impact on the IT industry
  • Internet of Things (IoT) and smart homes
  • Robotics and automation in manufacturing
  • Augmented reality and its potential in education
  • Data mining techniques for customer relationship management
  • Computer vision for object recognition and tracking
  • Quantum computing and its applications in cryptography
  • Social media analytics and sentiment analysis
  • Recommender systems for personalized content delivery
  • Mobile computing and its impact on society
  • Bioinformatics and genomic data analysis
  • Deep learning for image and speech recognition
  • Digital signal processing and audio processing algorithms
  • Cloud storage and data security in the cloud
  • Wearable technology and its impact on healthcare
  • Computational linguistics for natural language understanding
  • Cognitive computing for decision support systems
  • Cyber-physical systems and their applications
  • Edge computing and its impact on IoT
  • Machine learning for fraud detection
  • Cryptography and its role in secure communication
  • Cybersecurity risks in the era of the Internet of Things
  • Natural language generation for automated report writing
  • 3D printing and its impact on manufacturing
  • Virtual assistants and their applications in daily life
  • Cloud-based gaming and its impact on the gaming industry
  • Computer networks and their security issues
  • Cyber forensics and its role in criminal investigations
  • Machine learning for predictive maintenance in industrial settings
  • Augmented reality for cultural heritage preservation
  • Human-robot interaction and its applications
  • Data visualization and its impact on decision-making
  • Cybersecurity in financial systems and blockchain
  • Computer graphics and animation techniques
  • Biometrics and its role in secure authentication
  • Cloud-based e-learning platforms and their impact on education
  • Natural language processing for machine translation
  • Machine learning for predictive maintenance in healthcare
  • Cybersecurity and privacy issues in social media
  • Computer vision for medical image analysis
  • Natural language generation for content creation
  • Cybersecurity challenges in cloud computing
  • Human-robot collaboration in manufacturing
  • Data mining for predicting customer churn
  • Artificial intelligence for autonomous drones
  • Cybersecurity risks in the healthcare industry
  • Machine learning for speech synthesis
  • Edge computing for low-latency applications
  • Virtual reality for mental health therapy
  • Quantum computing and its applications in finance
  • Biomedical engineering and its applications
  • Cybersecurity in autonomous systems
  • Machine learning for predictive maintenance in transportation
  • Computer vision for object detection in autonomous driving
  • Augmented reality for industrial training and simulations
  • Cloud-based cybersecurity solutions for small businesses
  • Natural language processing for knowledge management
  • Machine learning for personalized advertising
  • Cybersecurity in the supply chain management
  • Cybersecurity risks in the energy sector
  • Computer vision for facial recognition
  • Natural language processing for social media analysis
  • Machine learning for sentiment analysis in customer reviews
  • Explainable Artificial Intelligence
  • Quantum Computing
  • Blockchain Technology
  • Human-Computer Interaction
  • Natural Language Processing
  • Cloud Computing
  • Robotics and Automation
  • Augmented Reality and Virtual Reality
  • Cyber-Physical Systems
  • Computational Neuroscience
  • Big Data Analytics
  • Computer Vision
  • Cryptography and Network Security
  • Internet of Things
  • Computer Graphics and Visualization
  • Artificial Intelligence for Game Design
  • Computational Biology
  • Social Network Analysis
  • Bioinformatics
  • Distributed Systems and Middleware
  • Information Retrieval and Data Mining
  • Computer Networks
  • Mobile Computing and Wireless Networks
  • Software Engineering
  • Database Systems
  • Parallel and Distributed Computing
  • Human-Robot Interaction
  • Intelligent Transportation Systems
  • High-Performance Computing
  • Cyber-Physical Security
  • Deep Learning
  • Sensor Networks
  • Multi-Agent Systems
  • Human-Centered Computing
  • Wearable Computing
  • Knowledge Representation and Reasoning
  • Adaptive Systems
  • Brain-Computer Interface
  • Health Informatics
  • Cognitive Computing
  • Cybersecurity and Privacy
  • Internet Security
  • Cybercrime and Digital Forensics
  • Cloud Security
  • Cryptocurrencies and Digital Payments
  • Machine Learning for Natural Language Generation
  • Cognitive Robotics
  • Neural Networks
  • Semantic Web
  • Image Processing
  • Cyber Threat Intelligence
  • Secure Mobile Computing
  • Cybersecurity Education and Training
  • Privacy Preserving Techniques
  • Cyber-Physical Systems Security
  • Virtualization and Containerization
  • Machine Learning for Computer Vision
  • Network Function Virtualization
  • Cybersecurity Risk Management
  • Information Security Governance
  • Intrusion Detection and Prevention
  • Biometric Authentication
  • Machine Learning for Predictive Maintenance
  • Security in Cloud-based Environments
  • Cybersecurity for Industrial Control Systems
  • Smart Grid Security
  • Software Defined Networking
  • Quantum Cryptography
  • Security in the Internet of Things
  • Natural language processing for sentiment analysis
  • Blockchain technology for secure data sharing
  • Developing efficient algorithms for big data analysis
  • Cybersecurity for internet of things (IoT) devices
  • Human-robot interaction for industrial automation
  • Image recognition for autonomous vehicles
  • Social media analytics for marketing strategy
  • Quantum computing for solving complex problems
  • Biometric authentication for secure access control
  • Augmented reality for education and training
  • Intelligent transportation systems for traffic management
  • Predictive modeling for financial markets
  • Cloud computing for scalable data storage and processing
  • Virtual reality for therapy and mental health treatment
  • Data visualization for business intelligence
  • Recommender systems for personalized product recommendations
  • Speech recognition for voice-controlled devices
  • Mobile computing for real-time location-based services
  • Neural networks for predicting user behavior
  • Genetic algorithms for optimization problems
  • Distributed computing for parallel processing
  • Internet of things (IoT) for smart cities
  • Wireless sensor networks for environmental monitoring
  • Cloud-based gaming for high-performance gaming
  • Social network analysis for identifying influencers
  • Autonomous systems for agriculture
  • Robotics for disaster response
  • Data mining for customer segmentation
  • Computer graphics for visual effects in movies and video games
  • Virtual assistants for personalized customer service
  • Natural language understanding for chatbots
  • 3D printing for manufacturing prototypes
  • Artificial intelligence for stock trading
  • Machine learning for weather forecasting
  • Biomedical engineering for prosthetics and implants
  • Cybersecurity for financial institutions
  • Machine learning for energy consumption optimization
  • Computer vision for object tracking
  • Natural language processing for document summarization
  • Wearable technology for health and fitness monitoring
  • Internet of things (IoT) for home automation
  • Reinforcement learning for robotics control
  • Big data analytics for customer insights
  • Machine learning for supply chain optimization
  • Natural language processing for legal document analysis
  • Artificial intelligence for drug discovery
  • Computer vision for object recognition in robotics
  • Data mining for customer churn prediction
  • Autonomous systems for space exploration
  • Robotics for agriculture automation
  • Machine learning for predicting earthquakes
  • Natural language processing for sentiment analysis in customer reviews
  • Big data analytics for predicting natural disasters
  • Internet of things (IoT) for remote patient monitoring
  • Blockchain technology for digital identity management
  • Machine learning for predicting wildfire spread
  • Computer vision for gesture recognition
  • Natural language processing for automated translation
  • Big data analytics for fraud detection in banking
  • Internet of things (IoT) for smart homes
  • Robotics for warehouse automation
  • Machine learning for predicting air pollution
  • Natural language processing for medical record analysis
  • Augmented reality for architectural design
  • Big data analytics for predicting traffic congestion
  • Machine learning for predicting customer lifetime value
  • Developing algorithms for efficient and accurate text recognition
  • Natural Language Processing for Virtual Assistants
  • Natural Language Processing for Sentiment Analysis in Social Media
  • Explainable Artificial Intelligence (XAI) for Trust and Transparency
  • Deep Learning for Image and Video Retrieval
  • Edge Computing for Internet of Things (IoT) Applications
  • Data Science for Social Media Analytics
  • Cybersecurity for Critical Infrastructure Protection
  • Natural Language Processing for Text Classification
  • Quantum Computing for Optimization Problems
  • Machine Learning for Personalized Health Monitoring
  • Computer Vision for Autonomous Driving
  • Blockchain Technology for Supply Chain Management
  • Augmented Reality for Education and Training
  • Natural Language Processing for Sentiment Analysis
  • Machine Learning for Personalized Marketing
  • Big Data Analytics for Financial Fraud Detection
  • Cybersecurity for Cloud Security Assessment
  • Artificial Intelligence for Natural Language Understanding
  • Blockchain Technology for Decentralized Applications
  • Virtual Reality for Cultural Heritage Preservation
  • Natural Language Processing for Named Entity Recognition
  • Machine Learning for Customer Churn Prediction
  • Big Data Analytics for Social Network Analysis
  • Cybersecurity for Intrusion Detection and Prevention
  • Artificial Intelligence for Robotics and Automation
  • Blockchain Technology for Digital Identity Management
  • Virtual Reality for Rehabilitation and Therapy
  • Natural Language Processing for Text Summarization
  • Machine Learning for Credit Risk Assessment
  • Big Data Analytics for Fraud Detection in Healthcare
  • Cybersecurity for Internet Privacy Protection
  • Artificial Intelligence for Game Design and Development
  • Blockchain Technology for Decentralized Social Networks
  • Virtual Reality for Marketing and Advertising
  • Natural Language Processing for Opinion Mining
  • Machine Learning for Anomaly Detection
  • Big Data Analytics for Predictive Maintenance in Transportation
  • Cybersecurity for Network Security Management
  • Artificial Intelligence for Personalized News and Content Delivery
  • Blockchain Technology for Cryptocurrency Mining
  • Virtual Reality for Architectural Design and Visualization
  • Natural Language Processing for Machine Translation
  • Machine Learning for Automated Image Captioning
  • Big Data Analytics for Stock Market Prediction
  • Cybersecurity for Biometric Authentication Systems
  • Artificial Intelligence for Human-Robot Interaction
  • Blockchain Technology for Smart Grids
  • Virtual Reality for Sports Training and Simulation
  • Natural Language Processing for Question Answering Systems
  • Machine Learning for Sentiment Analysis in Customer Feedback
  • Big Data Analytics for Predictive Maintenance in Manufacturing
  • Cybersecurity for Cloud-Based Systems
  • Artificial Intelligence for Automated Journalism
  • Blockchain Technology for Intellectual Property Management
  • Virtual Reality for Therapy and Rehabilitation
  • Natural Language Processing for Language Generation
  • Machine Learning for Customer Lifetime Value Prediction
  • Big Data Analytics for Predictive Maintenance in Energy Systems
  • Cybersecurity for Secure Mobile Communication
  • Artificial Intelligence for Emotion Recognition
  • Blockchain Technology for Digital Asset Trading
  • Virtual Reality for Automotive Design and Visualization
  • Natural Language Processing for Semantic Web
  • Machine Learning for Fraud Detection in Financial Transactions
  • Big Data Analytics for Social Media Monitoring
  • Cybersecurity for Cloud Storage and Sharing
  • Artificial Intelligence for Personalized Education
  • Blockchain Technology for Secure Online Voting Systems
  • Virtual Reality for Cultural Tourism
  • Natural Language Processing for Chatbot Communication
  • Machine Learning for Medical Diagnosis and Treatment
  • Big Data Analytics for Environmental Monitoring and Management.
  • Cybersecurity for Cloud Computing Environments
  • Virtual Reality for Training and Simulation
  • Big Data Analytics for Sports Performance Analysis
  • Cybersecurity for Internet of Things (IoT) Devices
  • Artificial Intelligence for Traffic Management and Control
  • Blockchain Technology for Smart Contracts
  • Natural Language Processing for Document Summarization
  • Machine Learning for Image and Video Recognition
  • Blockchain Technology for Digital Asset Management
  • Virtual Reality for Entertainment and Gaming
  • Natural Language Processing for Opinion Mining in Online Reviews
  • Machine Learning for Customer Relationship Management
  • Big Data Analytics for Environmental Monitoring and Management
  • Cybersecurity for Network Traffic Analysis and Monitoring
  • Artificial Intelligence for Natural Language Generation
  • Blockchain Technology for Supply Chain Transparency and Traceability
  • Virtual Reality for Design and Visualization
  • Natural Language Processing for Speech Recognition
  • Machine Learning for Recommendation Systems
  • Big Data Analytics for Customer Segmentation and Targeting
  • Cybersecurity for Biometric Authentication
  • Artificial Intelligence for Human-Computer Interaction
  • Blockchain Technology for Decentralized Finance (DeFi)
  • Virtual Reality for Tourism and Cultural Heritage
  • Machine Learning for Cybersecurity Threat Detection and Prevention
  • Big Data Analytics for Healthcare Cost Reduction
  • Cybersecurity for Data Privacy and Protection
  • Artificial Intelligence for Autonomous Vehicles
  • Blockchain Technology for Cryptocurrency and Blockchain Security
  • Virtual Reality for Real Estate Visualization
  • Natural Language Processing for Question Answering
  • Big Data Analytics for Financial Markets Prediction
  • Cybersecurity for Cloud-Based Machine Learning Systems
  • Artificial Intelligence for Personalized Advertising
  • Blockchain Technology for Digital Identity Verification
  • Virtual Reality for Cultural and Language Learning
  • Natural Language Processing for Semantic Analysis
  • Machine Learning for Business Forecasting
  • Big Data Analytics for Social Media Marketing
  • Artificial Intelligence for Content Generation
  • Blockchain Technology for Smart Cities
  • Virtual Reality for Historical Reconstruction
  • Natural Language Processing for Knowledge Graph Construction
  • Machine Learning for Speech Synthesis
  • Big Data Analytics for Traffic Optimization
  • Artificial Intelligence for Social Robotics
  • Blockchain Technology for Healthcare Data Management
  • Virtual Reality for Disaster Preparedness and Response
  • Natural Language Processing for Multilingual Communication
  • Machine Learning for Emotion Recognition
  • Big Data Analytics for Human Resources Management
  • Cybersecurity for Mobile App Security
  • Artificial Intelligence for Financial Planning and Investment
  • Blockchain Technology for Energy Management
  • Virtual Reality for Cultural Preservation and Heritage.
  • Big Data Analytics for Healthcare Management
  • Cybersecurity in the Internet of Things (IoT)
  • Artificial Intelligence for Predictive Maintenance
  • Computational Biology for Drug Discovery
  • Virtual Reality for Mental Health Treatment
  • Machine Learning for Sentiment Analysis in Social Media
  • Human-Computer Interaction for User Experience Design
  • Cloud Computing for Disaster Recovery
  • Quantum Computing for Cryptography
  • Intelligent Transportation Systems for Smart Cities
  • Cybersecurity for Autonomous Vehicles
  • Artificial Intelligence for Fraud Detection in Financial Systems
  • Social Network Analysis for Marketing Campaigns
  • Cloud Computing for Video Game Streaming
  • Machine Learning for Speech Recognition
  • Augmented Reality for Architecture and Design
  • Natural Language Processing for Customer Service Chatbots
  • Machine Learning for Climate Change Prediction
  • Big Data Analytics for Social Sciences
  • Artificial Intelligence for Energy Management
  • Virtual Reality for Tourism and Travel
  • Cybersecurity for Smart Grids
  • Machine Learning for Image Recognition
  • Augmented Reality for Sports Training
  • Natural Language Processing for Content Creation
  • Cloud Computing for High-Performance Computing
  • Artificial Intelligence for Personalized Medicine
  • Virtual Reality for Architecture and Design
  • Augmented Reality for Product Visualization
  • Natural Language Processing for Language Translation
  • Cybersecurity for Cloud Computing
  • Artificial Intelligence for Supply Chain Optimization
  • Blockchain Technology for Digital Voting Systems
  • Virtual Reality for Job Training
  • Augmented Reality for Retail Shopping
  • Natural Language Processing for Sentiment Analysis in Customer Feedback
  • Cloud Computing for Mobile Application Development
  • Artificial Intelligence for Cybersecurity Threat Detection
  • Blockchain Technology for Intellectual Property Protection
  • Virtual Reality for Music Education
  • Machine Learning for Financial Forecasting
  • Augmented Reality for Medical Education
  • Natural Language Processing for News Summarization
  • Cybersecurity for Healthcare Data Protection
  • Artificial Intelligence for Autonomous Robots
  • Virtual Reality for Fitness and Health
  • Machine Learning for Natural Language Understanding
  • Augmented Reality for Museum Exhibits
  • Natural Language Processing for Chatbot Personality Development
  • Cloud Computing for Website Performance Optimization
  • Artificial Intelligence for E-commerce Recommendation Systems
  • Blockchain Technology for Supply Chain Traceability
  • Virtual Reality for Military Training
  • Augmented Reality for Advertising
  • Natural Language Processing for Chatbot Conversation Management
  • Cybersecurity for Cloud-Based Services
  • Artificial Intelligence for Agricultural Management
  • Blockchain Technology for Food Safety Assurance
  • Virtual Reality for Historical Reenactments
  • Machine Learning for Cybersecurity Incident Response.
  • Secure Multiparty Computation
  • Federated Learning
  • Internet of Things Security
  • Blockchain Scalability
  • Quantum Computing Algorithms
  • Explainable AI
  • Data Privacy in the Age of Big Data
  • Adversarial Machine Learning
  • Deep Reinforcement Learning
  • Online Learning and Streaming Algorithms
  • Graph Neural Networks
  • Automated Debugging and Fault Localization
  • Mobile Application Development
  • Software Engineering for Cloud Computing
  • Cryptocurrency Security
  • Edge Computing for Real-Time Applications
  • Natural Language Generation
  • Virtual and Augmented Reality
  • Computational Biology and Bioinformatics
  • Internet of Things Applications
  • Robotics and Autonomous Systems
  • Explainable Robotics
  • 3D Printing and Additive Manufacturing
  • Distributed Systems
  • Parallel Computing
  • Data Center Networking
  • Data Mining and Knowledge Discovery
  • Information Retrieval and Search Engines
  • Network Security and Privacy
  • Cloud Computing Security
  • Data Analytics for Business Intelligence
  • Neural Networks and Deep Learning
  • Reinforcement Learning for Robotics
  • Automated Planning and Scheduling
  • Evolutionary Computation and Genetic Algorithms
  • Formal Methods for Software Engineering
  • Computational Complexity Theory
  • Bio-inspired Computing
  • Computer Vision for Object Recognition
  • Automated Reasoning and Theorem Proving
  • Natural Language Understanding
  • Machine Learning for Healthcare
  • Scalable Distributed Systems
  • Sensor Networks and Internet of Things
  • Smart Grids and Energy Systems
  • Software Testing and Verification
  • Web Application Security
  • Wireless and Mobile Networks
  • Computer Architecture and Hardware Design
  • Digital Signal Processing
  • Game Theory and Mechanism Design
  • Multi-agent Systems
  • Evolutionary Robotics
  • Quantum Machine Learning
  • Computational Social Science
  • Explainable Recommender Systems.
  • Artificial Intelligence and its applications
  • Cloud computing and its benefits
  • Cybersecurity threats and solutions
  • Internet of Things and its impact on society
  • Virtual and Augmented Reality and its uses
  • Blockchain Technology and its potential in various industries
  • Web Development and Design
  • Digital Marketing and its effectiveness
  • Big Data and Analytics
  • Software Development Life Cycle
  • Gaming Development and its growth
  • Network Administration and Maintenance
  • Machine Learning and its uses
  • Data Warehousing and Mining
  • Computer Architecture and Design
  • Computer Graphics and Animation
  • Quantum Computing and its potential
  • Data Structures and Algorithms
  • Computer Vision and Image Processing
  • Robotics and its applications
  • Operating Systems and its functions
  • Information Theory and Coding
  • Compiler Design and Optimization
  • Computer Forensics and Cyber Crime Investigation
  • Distributed Computing and its significance
  • Artificial Neural Networks and Deep Learning
  • Cloud Storage and Backup
  • Programming Languages and their significance
  • Computer Simulation and Modeling
  • Computer Networks and its types
  • Information Security and its types
  • Computer-based Training and eLearning
  • Medical Imaging and its uses
  • Social Media Analysis and its applications
  • Human Resource Information Systems
  • Computer-Aided Design and Manufacturing
  • Multimedia Systems and Applications
  • Geographic Information Systems and its uses
  • Computer-Assisted Language Learning
  • Mobile Device Management and Security
  • Data Compression and its types
  • Knowledge Management Systems
  • Text Mining and its uses
  • Cyber Warfare and its consequences
  • Wireless Networks and its advantages
  • Computer Ethics and its importance
  • Computational Linguistics and its applications
  • Autonomous Systems and Robotics
  • Information Visualization and its importance
  • Geographic Information Retrieval and Mapping
  • Business Intelligence and its benefits
  • Digital Libraries and their significance
  • Artificial Life and Evolutionary Computation
  • Computer Music and its types
  • Virtual Teams and Collaboration
  • Computer Games and Learning
  • Semantic Web and its applications
  • Electronic Commerce and its advantages
  • Multimedia Databases and their significance
  • Computer Science Education and its importance
  • Computer-Assisted Translation and Interpretation
  • Ambient Intelligence and Smart Homes
  • Autonomous Agents and Multi-Agent Systems.

Computing Engineering Dissertation Topics

Published by Jamie Walker on January 10th, 2023; revised on August 18, 2023

Over time, dissertations have become an integral component of higher education. They are entrenched not only in master's and PhD degrees but also in undergraduate programmes. Computer engineering dissertations allow researchers to choose a topic of particular interest to them and investigate it further to add to the current body of literature.

However, choosing a topic from an extensive list is always easier than simply working on the first topic you find interesting.

To help you get started with brainstorming, we have developed a list of the latest computer engineering dissertation topics that you can use when writing your computer engineering dissertation.

These topics have been developed by the PhD-qualified writers on our team, so you can confidently use them when drafting your dissertation.

You may also want to start your dissertation by requesting a brief research proposal from our writers on any of these topics. The proposal includes an introduction to the problem, research questions, aim and objectives, and a literature review, along with the proposed research methodology. Let us know if you need any help getting started.

Check our example dissertations to get an idea of how to structure your dissertation.

You can review our step-by-step guide on how to write your dissertation here.

View our free dissertation topics database.

Computer Engineering Dissertation Topics

Computers are among the greatest innovations of the modern era and have done wonders for mankind. There is only one language that computers understand: binary. High-level coding languages, which a computer cannot execute directly, therefore rely on a compiler for translation.

Computing refers to computer hardware and software development technology and covers all aspects of computer technology. It is the practical and scientific study of how information is processed and implemented, and the term is often used interchangeably with computer science.

A computing engineer or computer scientist specializes in practical work, the theory of computing, and the design of computational systems. Essentially, it is the study of the structure, expression, mechanization, and feasibility of algorithms (logical procedures) that govern the processing, communication, representation, access, and acquisition of information in a computer. This area covers a wide range of topics, some of which are listed below:

  • Risk calculation in the application and development process.
  • Generation of a Java application.
  • Implementing a behaviour-based approach to detect cheating in online games.
  • Analysis of the coding environments of different applications.
  • Identification of different languages used for coding.
  • Identification of stakeholders' interests in the app development process.
  • Role of visualization in complex hierarchical structures of computing.
  • Analysing the requirements of inventory management software.
  • Development of a single-player simulation game.
  • Investigation of a web-based teaching aid system.
  • Development of an online library management system.
  • Implementation of an electronic banking system.

2022 Computing Engineering Dissertation Topics

Topic 1: An Investigation of Blockchain's Applications in the Energy Sector, Leading Towards Electricity Production and E-mobility

Research Aim: This study aims to investigate the applications of blockchain within the energy sector. It will identify how blockchain can be used to produce electricity from the comfort of home. Moreover, the study introduces the concept of e-mobility through blockchain, whereby blockchain can be used to share car rides with other commuters living nearby. Another objective of this research is to develop a framework that could assess blockchain's use in helping consumers stay within a budget by letting them track how much money they have spent so far.

Topic 2: Investigating the Issues that Impact Data Security in Cloud-Based Blockchain Technology: A Global Tourism Industry Case Study

Research Aim: This research focuses on a significant shift in the worldwide tourism business: the use of the cloud for data and services. It attempts to establish the requirements for this implementation, given the demand for convenience, savings, and improved service provision. Furthermore, it will focus on the limits of traditional blockchain technology primitives and assess control models. These constraints relate to security issues involving data in a cloud environment in the global tourism industry.

Topic 3: Is Digital Technology Overtaking Human Interaction in the Medical Field? An Examination of the Use of Computational Biology and Machine Learning in Patient Diagnosis and Treatment

Research Aim: The current study seeks to examine how digital technology is replacing human interaction in the medical industry, with an emphasis on the importance of computational biology and machine learning in patient diagnosis and treatment. This study will set out the theoretical foundations and significance of computational biology and machine learning and will also make recommendations for further enhancement.

Topic 4: Evaluating the Use of Databases and Information Retrieval Systems from the Perspective of United States National Security Policy

Research Aim: The current study aims to evaluate the use of databases and information retrieval systems from the perspective of United States national security policy. The study examines databases and information retrieval systems to provide a clear understanding of them. It will also focus on specific elements and criteria in United States national security and highlight the benefits and drawbacks of employing these systems to enhance the national security strategy of the United States.

Topic 5: The Development of the Growing Infusion of Computer Technology in the Area of Medicine: Examining NHS Policies

Research Aim: This research aims to examine the development of the growing infusion of computer technology in the area of medicine by evaluating existing NHS policy. The study will provide a theoretical framework for the application of computer science technologies in medicine, set out the benefits of using contemporary computer technology, and analyse the drawbacks that have occurred as a result of the growth of this new technology in the field. It will also focus on the policies employed by the NHS to assist the development of technologies in the UK healthcare sector.

Computer Engineering Dissertation Topics for 2021

A 3-D visualization system for ultrasound images

Research Aim: This research will focus on the visualization of 3-D ultrasound images and their medical therapy benefits.

Reliable and realistic study of remote communication systems in telephony and multipath faded systems

Research Aim: This research’s primary emphasis is on telephony’s practical implementation in a remote communication system.

Establishing a Neural Network Device

Research Aim: In terms of energy efficiency, the human brain far surpasses any modern supercomputer. This study contributes to the design of a whole new generation of energy-efficient, brain-like computers.

Methods for studying EEG brain-function artifacts caused by sugar, salt, fat, and their replacements

Research Aim: In this study, a procedure for measuring the perception of sweetness is developed and validated. Part of the project involves modern electrode technologies to capture the purest possible brain signal from EEG equipment.

Find 100s of dissertation topics in your other academic subjects in our free topics database.

The impact of Covid-19 on tech spends in 2021

Research Aim: This research aims to study the impact of Covid-19 on tech spends in 2021.

Analysis of information system built for e-learning

Research Aim: This research aims to analyze the information system built for e-learning.

Advantages and disadvantages of an information system

Research Aim: This research aims to address the advantages and disadvantages of an information system.

Covid-19 Computer Engineering Research Topics

Research to study the effects of coronavirus on IT industries

Research Aim: This research will focus on the impacts of COVID-19 on the growth of IT industries, highlighting the issues responsible and possible solutions to overcome them.

Research to identify the impact of Coronavirus on the computer science research community

Research Aim: Coronavirus has infected millions of people worldwide and has been responsible for a great many deaths. This study will focus on identifying the effects of the pandemic on the computer science research community.

Research to study the impacts of COVID-19 on tech spends in 2021

Research Aim: As a result of COVID-19, the economy of the entire world has been disrupted. The purpose of this research is to understand tech expenditures after COVID-19 became widespread. How are the tech industries dealing with the challenging situation created by COVID-19?

Research to identify the contribution of computer science to control the spread of Coronavirus pandemic

Research Aim: This research aims at identifying the contributions and efforts made by computer engineers to control the pandemic. What is the role of computer scientists during the pandemic?

Research to identify the unemployment of computer engineers after the Coronavirus pandemic

Research Aim: This research will focus on identifying the increased unemployment among computer engineers after the COVID-19 pandemic and finding possible solutions to reduce it.

Hardware, Network and Security Dissertation Topics

Network security is crucial for any organisation. It depends on a well-managed network in which policies drafted by network administrators control access to organisational information. Network security provides stability, safety, integrity, reliability, and utility for both data and the network.

It works most efficiently with the latest hardware and up-to-date software. Network security offers many advantages to businesses, such as protection against disruption, which keeps employees motivated, energetic, and productive.

In certain instances, a virus may breach network security. However, the network administrator generally uses an anti-virus program to prevent this sort of attack.

Therefore, it is fair to say that network security plays a vital role in maintaining a business's reputation and operations, which are among the most important assets of any organisation. Below is a list of topics that you can base your dissertation on:

  • Performance analysis of transmission control protocol over Ethernet LAN.
  • Gateway usage for the intrusion detection system.
  • Impact of security mechanisms on online transactions.
  • Investigation of smart card specification.
  • Importance of router placement in the network.
  • Level of customer’s trust in E-banking.
  • Role of antivirus in a shared network.
  • Application of database technologies for data network management.
  • Network worm: A headache to networking.
  • Implementation of various tools in programming language.
  • Study of retroactive data structures.
  • Role of Voice over Internet Protocol over Ethernet LAN.
  • The usefulness of data transfer security over Wi-Fi Network.
  • Influence of signal strength of Wi-Fi upon data transfer.
  • Analysis of tree inclusion complexities.
  • Analysis of the implementation of the set procedure.
  • Analysis of the application of programming tools.
  • Implementation of File Sharing System in Network.
  • Study of virus behaviours in the secured programming environment.
  • Investigation of issues of user’s security and data protection over the network.
  • Benefits of network security to customers.
  • Improvements of mobile data service for future usage.
  • Study of the asymmetric k-center variant.
  • Analysis of issues in emerging 4G networks.
  • Role of dynamic proxies in a mobile environment to support Remote method Invocation.

Software, Programming and Algorithm Dissertation Topics

In layman's terms, software is collectively known as "the combination of operating information and all the programs being used by the computer." It is a set of instructions that direct a computer to perform a specific task, depending upon the user's instructions.

Software can be written in both high- and low-level languages. A low-level language is also known as machine code; it is faster because it does not require a compiler and communicates directly with the computer. A high-level language is much closer to human language and can therefore be easily understood by developers; it requires a compiler to translate its commands for the computer.
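
As a minimal sketch of this distinction (assuming a standard C toolchain such as gcc and a hypothetical file name sum.c), the high-level statements below cannot run on the processor directly; a compiler must first translate them into machine code:

    #include <stdio.h>

    /* A small high-level program: a compiler (e.g., gcc sum.c -o sum)
       translates these statements into machine instructions. */
    int main(void) {
        int a = 2, b = 3;
        int sum = a + b;
        printf("%d\n", sum);   /* prints 5 */
        return 0;
    }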

Programming and algorithms can be described as commands given to the computer to perform actions. Programming produces executable programs from a computing problem and involves developing, generating, and analysing algorithms. An algorithm is a step-by-step procedure for solving a problem; it is a set of logic expressed in software.
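
As an illustrative sketch of such a step-by-step procedure (the function and variable names here are arbitrary), binary search repeatedly halves a sorted array until it finds the target value or runs out of candidates:

    #include <stdio.h>

    /* Binary search over a sorted array: returns the index of target, or -1. */
    int binary_search(const int *a, int n, int target) {
        int lo = 0, hi = n - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;   /* midpoint without overflow */
            if (a[mid] == target)
                return mid;
            else if (a[mid] < target)
                lo = mid + 1;               /* discard the lower half */
            else
                hi = mid - 1;               /* discard the upper half */
        }
        return -1;
    }

    int main(void) {
        int data[] = {2, 5, 8, 12, 23, 38, 56};
        printf("%d\n", binary_search(data, 7, 23));   /* prints 4 */
        return 0;
    }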

There are two types of software: operating software, which helps in operation, and system software, which is necessary to run a system. Operating software can be rewritten and changed according to demand, but system software cannot be altered; if developers require any alterations, they have to develop new software.

There are various topics that can be considered for research dissertation purposes under this theme, a list of which is given below.

  • Application of algorithms.
  • Importance of approximation algorithms on graphs.
  • Critical analysis of data structures on trees.
  • Evaluation and implementation of new algorithms.
  • System software: A link to communicate hardware.
  • Difference between binary dispatching and multiple dispatching.
  • Analysis of plane sweep techniques.
  • Investigation of software support to drivers of devices.
  • Intelligent interface for database systems.
  • Analysis of function and types of union-find.
  • The usefulness of different coding languages.
  • Application of basic hardware knowledge and math skills.
  • Analysis of the design of converter based on new moduli.
  • Analysis of information travelling via software.
  • Evaluation and implementation of heuristic algorithms.
  • Development of applications using Java.
  • Analysis of fault tolerance in a network by using simulation.
  • Importance of system software for computers.
  • Effects of larger integer module operations.
  • Consequences of wrong commands in coding.
  • Investigation of the coding language of system software.
  • Analysis of feasibility environment of platform.
  • Evaluation of heuristic algorithms for generating clusters.
  • Critical analysis of fixed control variable.
  • Analysis of design of converter with large dynamic range.
  • Ways to recover corrupted software.
  • Analysis of fault tolerance of sorting network.
  • Analysis of the difference between LAN and WAN.
  • Development of an algorithm for a one-way hashing system.
  • Relation between dynamic access and fixed values.
  • Importance of right language selection while coding.
  • Study of optimization problems.
  • Analysis of security frameworks for web services.
  • Investigating algorithms techniques.
  • Partial persistence of algorithms vs others.
  • Study of time and space problems of algorithmic functions.
  • Effects of linear and logarithmic factors over programming.
  • Discussion about union-find with deletion.
  • Importance of data structure for bridge core problems.
  • Consequences of fault in interconnected networks.
  • Difference between rooted and unrooted tree.

Information Systems Dissertation Topics

Information systems refer to groups of people and computers that are used to interpret all kinds of information. Computer-based information systems are a very interesting research topic. They cover decision-making, management support, and operations, and can also be used to access databases.

There is an obvious difference between computer systems, information systems, and business processes. The information system provides the tools to manage businesses successfully.

An information system can be thought of as a workplace where humans and machines work together towards the success of a business. One such example is Wal-Mart: the company is built on information systems and has connected its suppliers, vendors, and customers together.

It deals with large amounts of data and consists of the hardware, software, network, and telecommunications of the operation. Below is a list of research topics in the field of information systems on which you can base your dissertation:

  • Analysis of challenges in building information systems for any organisation.
  • Impact of cyberinfrastructure on the customer.
  • Role of information system in scientific innovations.
  • The usefulness of information systems for businesses.
  • Advantages of information systems.
  • Access to information systems by employees anywhere in the world.
  • Preparation of a database management system.
  • Analysis and solution of database management systems.
  • Study of support of information system to hardware.
  • Managing information systems of big stores: the case of Walmart.
  • Analysis of information system built for E-learning.
  • Critical analysis of the changing nature of the web.
  • Role of information system in decision making of disruptions.
  • Examine customer response through the information system.
  • Investigate the impact of a virus in the network
  • Relationship between I.T education and an organization.
  • Role of information system in global warming.
  • Investigate the reason for adopting green information systems.
  • Analysis of the relationship between social networks and information systems.
  • Role of information system in dealing with complex business problems.

Important Notes:

As a computing engineering student looking to get good grades, it is essential to develop new ideas and experiment with existing computing engineering theories – i.e., to add value and interest to your research topic.

The field of computing engineering is vast and interrelated with many other academic disciplines, such as civil engineering, finance, construction, law, healthcare, mental health, artificial intelligence, tourism, physiotherapy, sociology, management, marketing and nursing. That is why it is imperative to create a computing engineering dissertation topic that is particular, sound, and actually solves a practical problem that may be rampant in the field.

We can't stress how important it is to develop a logical research topic; it is the basis of your entire research. There are several significant downfalls to getting your topic wrong: your supervisor may not be interested in working on it, the topic may have no academic credibility, the research may not make logical sense, or the study may not be viable.

This impacts the time and effort you spend writing your dissertation, as you may end up in a cycle of rejection at the very initial stage. That is why we recommend reviewing existing research to develop a topic, taking advice from your supervisor, and even asking for help at this particular stage of your dissertation.

While developing a research topic, keeping our advice in mind will allow you to pick one of the best computing engineering dissertation topics, one that fulfils your requirement of writing a research paper and adds to the body of knowledge.

Therefore, it is recommended that when finalizing your dissertation topic, you read recently published literature to identify gaps in the research that you may help fill.

Remember: dissertation topics need to be unique, solve an identified problem, be logical, and be practically implementable. Take a look at some of our sample computing engineering dissertation topics to get an idea for your own dissertation.

How to Structure your Dissertation on Computing Engineering

A well-structured dissertation can help students achieve a high overall academic grade.

  • A Title Page
  • Acknowledgements
  • Declaration
  • Abstract: A summary of the research completed
  • Table of Contents
  • Introduction : This chapter includes the project rationale, research background, key research aims and objectives, and the research problems to be addressed. An outline of the structure of a dissertation  can also be added to this chapter.
  • Literature Review: This chapter presents relevant theories and frameworks by analysing published and unpublished literature available on the chosen research topic, in light of the research questions to be addressed. The purpose is to highlight and discuss the relative weaknesses and strengths of the selected research area whilst identifying any research gaps. A breakdown of the topic and its key terms can have a positive impact on your dissertation and on your tutor.
  • Methodology:  The  data collection  and  analysis  methods and techniques employed by the researcher are presented in the Methodology chapter which usually includes  research design,  research philosophy, research limitations, code of conduct, ethical consideration, data collection methods and  data analysis strategy .
  • Findings and Analysis:  Findings of the research are analysed in detail under the Findings and Analysis chapter. All key findings/results are outlined in this chapter without interpreting the data or drawing any conclusions. It can be useful to include  graphs , charts, and   tables in this chapter to identify meaningful trends and relationships.
  • Discussion and Conclusion: The researcher presents their interpretation of the results in this chapter and states whether the research hypothesis has been verified or not. An essential aspect of this section is to draw a link between the results and evidence from the literature. Recommendations regarding the implications of the findings and directions for future research may also be provided. Finally, a summary of the overall research, along with final judgements, opinions, and comments, should be included, together with suggestions for improvement.
  • References:  This should be completed in accordance with your University’s requirements
  • Bibliography
  • Appendices: Any additional information, diagrams, or graphs that were used to complete the dissertation but are not part of its main body should be included in the Appendices chapter. Essentially, the purpose is to expand on the information/data.

About ResearchProspect Ltd

ResearchProspect is a UK-based academic writing service that provides help with Dissertation Proposal Writing, Ph.D. Proposal Writing, Dissertation Writing, and Dissertation Editing and Improvement.

For further assistance with your dissertation, take a look at our full dissertation writing service.

Our team of writers is highly qualified; they are experts in their respective fields and have been working in the industry for a long time, so they are aware of the issues and trends of the industries they work in.


Frequently Asked Questions

How to find dissertation topics about computing engineering?

To find computing engineering dissertation topics:

  • Explore emerging technologies.
  • Investigate industry challenges.
  • Review recent research papers.
  • Consider AI, cybersecurity, IoT.
  • Brainstorm software/hardware innovations.
  • Select a topic aligning with your passion and career aspirations.


Digital Commons @ University of South Florida


Computer Science and Engineering Theses and Dissertations

Theses/Dissertations from 2023

Refining the Machine Learning Pipeline for US-based Public Transit Systems , Jennifer Adorno

Insect Classification and Explainability from Image Data via Deep Learning Techniques , Tanvir Hossain Bhuiyan

Brain-Inspired Spatio-Temporal Learning with Application to Robotics , Thiago André Ferreira Medeiros

Evaluating Methods for Improving DNN Robustness Against Adversarial Attacks , Laureano Griffin

Analyzing Multi-Robot Leader-Follower Formations in Obstacle-Laden Environments , Zachary J. Hinnen

Secure Lightweight Cryptographic Hardware Constructions for Deeply Embedded Systems , Jasmin Kaur

A Psychometric Analysis of Natural Language Inference Using Transformer Language Models , Antonio Laverghetta Jr.

Graph Analysis on Social Networks , Shen Lu

Deep Learning-based Automatic Stereology for High- and Low-magnification Images , Hunter Morera

Deciphering Trends and Tactics: Data-driven Techniques for Forecasting Information Spread and Detecting Coordinated Campaigns in Social Media , Kin Wai Ng Lugo

Automated Approaches to Enable Innovative Civic Applications from Citizen Generated Imagery , Hye Seon Yi

Theses/Dissertations from 2022

Towards High Performing and Reliable Deep Convolutional Neural Network Models for Typically Limited Medical Imaging Datasets , Kaoutar Ben Ahmed

Task Progress Assessment and Monitoring Using Self-Supervised Learning , Sainath Reddy Bobbala

Towards More Task-Generalized and Explainable AI Through Psychometrics , Alec Braynen

A Multiple Input Multiple Output Framework for the Automatic Optical Fractionator-based Cell Counting in Z-Stacks Using Deep Learning , Palak Dave

On the Reliability of Wearable Sensors for Assessing Movement Disorder-Related Gait Quality and Imbalance: A Case Study of Multiple Sclerosis , Steven Díaz Hernández

Securing Critical Cyber Infrastructures and Functionalities via Machine Learning Empowered Strategies , Tao Hou

Social Media Time Series Forecasting and User-Level Activity Prediction with Gradient Boosting, Deep Learning, and Data Augmentation , Fred Mubang

A Study of Deep Learning Silhouette Extractors for Gait Recognition , Sneha Oladhri

Analyzing Decision-making in Robot Soccer for Attacking Behaviors , Justin Rodney

Generative Spatio-Temporal and Multimodal Analysis of Neonatal Pain , Md Sirajus Salekin

Secure Hardware Constructions for Fault Detection of Lattice-based Post-quantum Cryptosystems , Ausmita Sarker

Adaptive Multi-scale Place Cell Representations and Replay for Spatial Navigation and Learning in Autonomous Robots , Pablo Scleidorovich

Predicting the Number of Objects in a Robotic Grasp , Utkarsh Tamrakar

Humanoid Robot Motion Control for Ramps and Stairs , Tommy Truong

Preventing Variadic Function Attacks Through Argument Width Counting , Brennan Ward

Theses/Dissertations from 2021

Knowledge Extraction and Inference Based on Visual Understanding of Cooking Contents , Ahmad Babaeian Babaeian Jelodar

Efficient Post-Quantum and Compact Cryptographic Constructions for the Internet of Things , Rouzbeh Behnia

Efficient Hardware Constructions for Error Detection of Post-Quantum Cryptographic Schemes , Alvaro Cintas Canto

Using Hyper-Dimensional Spanning Trees to Improve Structure Preservation During Dimensionality Reduction , Curtis Thomas Davis

Design, Deployment, and Validation of Computer Vision Techniques for Societal Scale Applications , Arup Kanti Dey

AffectiveTDA: Using Topological Data Analysis to Improve Analysis and Explainability in Affective Computing , Hamza Elhamdadi

Automatic Detection of Vehicles in Satellite Images for Economic Monitoring , Cole Hill

Analysis of Contextual Emotions Using Multimodal Data , Saurabh Hinduja

Data-driven Studies on Social Networks: Privacy and Simulation , Yasanka Sameera Horawalavithana

Automated Identification of Stages in Gonotrophic Cycle of Mosquitoes Using Computer Vision Techniques , Sherzod Kariev

Exploring the Use of Neural Transformers for Psycholinguistics , Antonio Laverghetta Jr.

Secure VLSI Hardware Design Against Intellectual Property (IP) Theft and Cryptographic Vulnerabilities , Matthew Dean Lewandowski

Turkic Interlingua: A Case Study of Machine Translation in Low-resource Languages , Jamshidbek Mirzakhalov

Automated Wound Segmentation and Dimension Measurement Using RGB-D Image , Chih-Yun Pai

Constructing Frameworks for Task-Optimized Visualizations , Ghulam Jilani Abdul Rahim Quadri

Trilateration-Based Localization in Known Environments with Object Detection , Valeria M. Salas Pacheco

Recognizing Patterns from Vital Signs Using Spectrograms , Sidharth Srivatsav Sribhashyam

Recognizing Emotion in the Wild Using Multimodal Data , Shivam Srivastava

A Modular Framework for Multi-Rotor Unmanned Aerial Vehicles for Military Operations , Dante Tezza

Human-centered Cybersecurity Research — Anthropological Findings from Two Longitudinal Studies , Anwesh Tuladhar

Learning State-Dependent Sensor Measurement Models To Improve Robot Localization Accuracy , Troi André Williams

Human-centric Cybersecurity Research: From Trapping the Bad Guys to Helping the Good Ones , Armin Ziaie Tabari

Theses/Dissertations from 2020

Classifying Emotions with EEG and Peripheral Physiological Data Using 1D Convolutional Long Short-Term Memory Neural Network , Rupal Agarwal

Keyless Anti-Jamming Communication via Randomized DSSS , Ahmad Alagil

Active Deep Learning Method to Automate Unbiased Stereology Cell Counting , Saeed Alahmari

Composition of Atomic-Obligation Security Policies , Yan Cao Albright

Action Recognition Using the Motion Taxonomy , Maxat Alibayev

Sentiment Analysis in Peer Review , Zachariah J. Beasley

Spatial Heterogeneity Utilization in CT Images for Lung Nodule Classification , Dmitrii Cherezov

Feature Selection Via Random Subsets Of Uncorrelated Features , Long Kim Dang

Unifying Security Policy Enforcement: Theory and Practice , Shamaria Engram

PsiDB: A Framework for Batched Query Processing and Optimization , Mehrad Eslami

Composition of Atomic-Obligation Security Policies , Danielle Ferguson

Algorithms To Profile Driver Behavior From Zero-permission Embedded Sensors , Bharti Goel

The Efficiency and Accuracy of YOLO for Neonate Face Detection in the Clinical Setting , Jacqueline Hausmann

Beyond the Hype: Challenges of Neural Networks as Applied to Social Networks , Anthony Hernandez

Privacy-Preserving and Functional Information Systems , Thang Hoang

Managing Off-Grid Power Use for Solar Fueled Residences with Smart Appliances, Prices-to-Devices and IoT , Donnelle L. January

Novel Bit-Sliced In-Memory Computing Based VLSI Architecture for Fast Sobel Edge Detection in IoT Edge Devices , Rajeev Joshi

Edge Computing for Deep Learning-Based Distributed Real-time Object Detection on IoT Constrained Platforms at Low Frame Rate , Lakshmikavya Kalyanam

Establishing Topological Data Analysis: A Comparison of Visualization Techniques , Tanmay J. Kotha

Machine Learning for the Internet of Things: Applications, Implementation, and Security , Vishalini Laguduva Ramnath

System Support of Concurrent Database Query Processing on a GPU , Hao Li

Deep Learning Predictive Modeling with Data Challenges (Small, Big, or Imbalanced) , Renhao Liu

Countermeasures Against Various Network Attacks Using Machine Learning Methods , Yi Li

Towards Safe Power Oversubscription and Energy Efficiency of Data Centers , Sulav Malla

Design of Support Measures for Counting Frequent Patterns in Graphs , Jinghan Meng

Automating the Classification of Mosquito Specimens Using Image Processing Techniques , Mona Minakshi

Models of Secure Software Enforcement and Development , Hernan M. Palombo

Functional Object-Oriented Network: A Knowledge Representation for Service Robotics , David Andrés Paulius Ramos

Lung Nodule Malignancy Prediction from Computed Tomography Images Using Deep Learning , Rahul Paul

Algorithms and Framework for Computing 2-body Statistics on Graphics Processing Units , Napath Pitaksirianan

Efficient Viewshed Computation Algorithms On GPUs and CPUs , Faisal F. Qarah

Relational Joins on GPUs for In-Memory Database Query Processing , Ran Rui

Micro-architectural Countermeasures for Control Flow and Misspeculation Based Software Attacks , Love Kumar Sah

Efficient Forward-Secure and Compact Signatures for the Internet of Things (IoT) , Efe Ulas Akay Seyitoglu

Detecting Symptoms of Chronic Obstructive Pulmonary Disease and Congestive Heart Failure via Cough and Wheezing Sounds Using Smart-Phones and Machine Learning , Anthony Windmon

Toward Culturally Relevant Emotion Detection Using Physiological Signals , Khadija Zanna

Theses/Dissertations from 2019

Beyond Labels and Captions: Contextualizing Grounded Semantics for Explainable Visual Interpretation , Sathyanarayanan Narasimhan Aakur

Empirical Analysis of a Cybersecurity Scoring System , Jaleel Ahmed

Phenomena of Social Dynamics in Online Games , Essa Alhazmi

A Machine Learning Approach to Predicting Community Engagement on Social Media During Disasters , Adel Alshehri

Interactive Fitness Domains in Competitive Coevolutionary Algorithm , ATM Golam Bari

Measuring Influence Across Social Media Platforms: Empirical Analysis Using Symbolic Transfer Entropy , Abhishek Bhattacharjee

A Communication-Centric Framework for Post-Silicon System-on-chip Integration Debug , Yuting Cao

Authentication and SQL-Injection Prevention Techniques in Web Applications , Cagri Cetin

Multimodal Emotion Recognition Using 3D Facial Landmarks, Action Units, and Physiological Data , Diego Fabiano

Robotic Motion Generation by Using Spatial-Temporal Patterns from Human Demonstrations , Yongqiang Huang


Computer science articles from across Nature Portfolio

Computer science is the study and development of the protocols required for automated processing and manipulation of data. This includes, for example, creating algorithms for efficiently searching large volumes of information or encrypting data so that it can be stored and transmitted securely.
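
As a toy sketch of the second example (illustrative only and not a secure scheme; production systems use vetted ciphers such as AES from a cryptographic library), the snippet below XORs a message with a repeating key, and applying the same operation again restores the original bytes:

    #include <stdio.h>
    #include <string.h>

    /* Toy XOR "cipher": XORing with the same key twice restores the data.
       Not secure; shown only to illustrate the idea of reversible encryption. */
    static void xor_with_key(unsigned char *buf, size_t len,
                             const unsigned char *key, size_t keylen) {
        for (size_t i = 0; i < len; i++)
            buf[i] ^= key[i % keylen];
    }

    int main(void) {
        unsigned char msg[] = "store and transmit me";
        const unsigned char key[] = "k3y";
        size_t len = strlen((char *)msg);

        xor_with_key(msg, len, key, 3);   /* scramble the bytes */
        xor_with_key(msg, len, key, 3);   /* apply again: message restored */
        printf("%s\n", (char *)msg);      /* prints the original text */
        return 0;
    }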

Latest Research and Reviews


Linear matrix genetic programming as a tool for data-driven black-box control-oriented modeling in conditions of limited access to training data

  • Tomasz Praczyk
  • Maciej Szymkowiak


On-device query intent prediction with lightweight LLMs to support ubiquitous conversations

  • Mateusz Dubiel
  • Yasmine Barghouti
  • Luis A. Leiva


A robust self-supervised approach for fine-grained crack detection in concrete structures

  • Muhammad Sohaib
  • Md Junayed Hasan
  • Zhonglong Zheng


Visibility forecast in Jiangsu province based on the GCN-GRU model

  • Huansang Chen


A multi-strategy improved rime optimization algorithm for three-dimensional USV path planning and global optimization

  • Jingjun Lou


Development and deployment of a histopathology-based deep learning algorithm for patient prescreening in a clinical trial

Here, the authors develop a deep-learning algorithm to predict biomarkers from histopathological imaging in advanced urothelial cancer patients. This method detects suitable patients for targeted therapy clinical trials with a significant reduction in molecular testing, providing cost and time savings in real-world clinical settings.

  • Albert Juan Ramon
  • Chaitanya Parmar
  • Kristopher A. Standish


News and Comment


Who owns your voice? Scarlett Johansson OpenAI complaint raises questions

In the age of artificial intelligence, situations are emerging that challenge the laws over rights to a persona.

  • Nicola Jones

Anglo-American bias could make generative AI an invisible intellectual cage

  • Queenie Luo
  • Michael Puett


AlphaFold3 — why did Nature publish it without its code?

Criticism of our decision to publish AlphaFold3 raises important questions. We welcome readers’ views.


Back to basics to open the black box

Most research efforts in machine learning focus on performance and are detached from an explanation of the behaviour of the model. We call for going back to basics of machine learning methods, with more focus on the development of a basic understanding grounded in statistical theory.

  • Diego Marcondes
  • Adilson Simonis
  • Junior Barrera


Quantum computing for oncology

As quantum technology advances, it holds immense potential to accelerate oncology discovery through enhanced molecular modeling, genomic analysis, medical imaging, and quantum sensing.

  • Siddhi Ramesh
  • Teague Tomesh
  • Alexander T. Pearson


Autonomous interference-avoiding machine-to-machine communications

An article in IEEE Journal on Selected Areas in Communications proposes algorithmic solutions to dynamically optimize MIMO waveforms to minimize or eliminate interference in autonomous machine-to-machine communications.


Princeton University


Suggested Undergraduate Research Topics


How to Contact Faculty for IW/Thesis Advising

Send the professor an e-mail. When you write a professor, be clear that you want a meeting regarding a senior thesis or one-on-one IW project, and briefly describe the topic or idea that you want to work on. Check the faculty listing for email addresses.

Parastoo Abtahi, Room 419

Available for single-semester IW and senior thesis advising, 2024-2025

  • Research Areas: Human-Computer Interaction (HCI), Augmented Reality (AR), and Spatial Computing
  • Input techniques for on-the-go interaction (e.g., eye-gaze, microgestures, voice) with a focus on uncertainty, disambiguation, and privacy.
  • Minimal and timely multisensory output (e.g., spatial audio, haptics) that enables users to attend to their physical environment and the people around them, instead of a 2D screen.
  • Interaction with intelligent systems (e.g., IoT, robots) situated in physical spaces with a focus on updating users’ mental model despite the complexity and dynamicity of these systems.

Ryan Adams, Room 411

Research areas:

  • Machine learning driven design
  • Generative models for structured discrete objects
  • Approximate inference in probabilistic models
  • Accelerating solutions to partial differential equations
  • Innovative uses of automatic differentiation
  • Modeling and optimizing 3d printing and CNC machining

Andrew Appel, Room 209

Available for Fall 2024 IW advising, only

  • Research Areas: Formal methods, programming languages, compilers, computer security.
  • Software verification (for which taking COS 326 / COS 510 is helpful preparation)
  • Game theory of poker or other games (for which COS 217 / 226 are helpful)
  • Computer game-playing programs (for which COS 217 / 226 are helpful)
  •  Risk-limiting audits of elections (for which ORF 245 or other knowledge of probability is useful)

Sanjeev Arora, Room 407

  • Theoretical machine learning, deep learning and its analysis, natural language processing. My advisees would typically have taken a course in algorithms (COS423 or COS 521 or equivalent) and a course in machine learning.
  • Show that finding approximate solutions to NP-complete problems is also NP-complete (i.e., come up with NP-completeness reductions a la COS 487). 
  • Experimental Algorithms: Implementing and Evaluating Algorithms using existing software packages. 
  • Studying/designing provable algorithms for machine learning and implementations using packages like scipy and MATLAB, including applications in natural language processing and deep learning.
  • Any topic in theoretical computer science.

David August, Room 221

Not available for IW or thesis advising, 2024-2025

  • Research Areas: Computer Architecture, Compilers, Parallelism
  • Containment-based approaches to security:  We have designed and tested a simple hardware+software containment mechanism that stops incorrect communication resulting from faults, bugs, or exploits from leaving the system.   Let's explore ways to use containment to solve real problems.  Expect to work with corporate security and technology decision-makers.
  • Parallelism: Studies show much more parallelism than is currently realized in compilers and architectures.  Let's find ways to realize this parallelism.
  • Any other interesting topic in computer architecture or compilers. 

Mark Braverman, 194 Nassau St., Room 231

  • Research Areas: computational complexity, algorithms, applied probability, computability over the real numbers, game theory and mechanism design, information theory.
  • Topics in computational and communication complexity.
  • Applications of information theory in complexity theory.
  • Algorithms for problems under real-life assumptions.
  • Game theory, network effects
  • Mechanism design (could be on a problem proposed by the student)

Sebastian Caldas, 221 Nassau Street, Room 105

  • Research Areas: collaborative learning, machine learning for healthcare. Typically, I will work with students that have taken COS324.
  • Methods for collaborative and continual learning.
  • Machine learning for healthcare applications.

Bernard Chazelle, 194 Nassau St., Room 301

  • Research Areas: Natural Algorithms, Computational Geometry, Sublinear Algorithms. 
  • Natural algorithms (flocking, swarming, social networks, etc).
  • Sublinear algorithms
  • Self-improving algorithms
  • Markov data structures

Danqi Chen, Room 412

  • My advisees would be expected to have taken a course in machine learning and ideally have taken COS484 or an NLP graduate seminar.
  • Representation learning for text and knowledge bases
  • Pre-training and transfer learning
  • Question answering and reading comprehension
  • Information extraction
  • Text summarization
  • Any other interesting topics related to natural language understanding/generation

Marcel Dall'Agnol, Corwin 034

  • Research Areas: Theoretical computer science. (Specifically, quantum computation, sublinear algorithms, complexity theory, interactive proofs and cryptography)
  • Research Areas: Machine learning

Jia Deng, Room 423

  •  Research Areas: Computer Vision, Machine Learning.
  • Object recognition and action recognition
  • Deep Learning, autoML, meta-learning
  • Geometric reasoning, logical reasoning

Adji Bousso Dieng, Room 406

  • Research areas: Vertaix is a research lab at Princeton University led by Professor Adji Bousso Dieng. We work at the intersection of artificial intelligence (AI) and the natural sciences. The models and algorithms we develop are motivated by problems in those domains and contribute to advancing methodological research in AI. We leverage tools in statistical machine learning and deep learning in developing methods for learning with the data, of various modalities, arising from the natural sciences.

Robert Dondero, Corwin Hall, Room 038

  • Research Areas:  Software engineering; software engineering education.
  • Develop or evaluate tools to facilitate student learning in undergraduate computer science courses at Princeton, and beyond.
  • In particular, can code critiquing tools help students learn about software quality?

Zeev Dvir, 194 Nassau St., Room 250

  • Research Areas: computational complexity, pseudo-randomness, coding theory and discrete mathematics.
  • Independent Research: I have various research problems related to Pseudorandomness, Coding theory, Complexity and Discrete mathematics - all of which require strong mathematical background. A project could also be based on writing a survey paper describing results from a few theory papers revolving around some particular subject.

Benjamin Eysenbach, Room 416

  • Research areas: reinforcement learning, machine learning. My advisees would typically have taken COS324.
  • Applying RL algorithms to applications in science and engineering.
  • Emergent behavior of RL algorithms on high-fidelity robotic simulators.
  • Studying how architectures and representations can facilitate generalization.

Christiane Fellbaum, 1-S-14 Green

  • Research Areas: theoretical and computational linguistics, word sense disambiguation, lexical resource construction, English and multilingual WordNet(s), ontology
  • Anything having to do with natural language--come and see me with/for ideas suitable to your background and interests. Some topics students have worked on in the past:
  • Developing parsers, part-of-speech taggers, morphological analyzers for underrepresented languages (you don't have to know the language to develop such tools!)
  • Quantitative approaches to theoretical linguistics questions
  • Extensions and interfaces for WordNet (English and WN in other languages),
  • Applications of WordNet(s), including:
  • Foreign language tutoring systems,
  • Spelling correction software,
  • Word-finding/suggestion software for ordinary users and people with memory problems,
  • Machine Translation 
  • Sentiment and Opinion detection
  • Automatic reasoning and inferencing
  • Collaboration with professors in the social sciences and humanities ("Digital Humanities")

Adam Finkelstein, Room 424 

  • Research Areas: computer graphics, audio.

Robert S. Fish, Corwin Hall, Room 037

  • Networking and telecommunications
  • Learning, perception, and intelligence, artificial and otherwise;
  • Human-computer interaction and computer-supported cooperative work
  • Online education, especially in Computer Science Education
  • Topics in research and development innovation methodologies including standards, open-source, and entrepreneurship
  • Distributed autonomous organizations and related blockchain technologies

Michael Freedman, Room 308 

  • Research Areas: Distributed systems, security, networking
  • Projects related to streaming data analysis, datacenter systems and networks, untrusted cloud storage and applications. Please see my group website at http://sns.cs.princeton.edu/ for current research projects.

Ruth Fong, Room 032

  • Research Areas: computer vision, machine learning, deep learning, interpretability, explainable AI, fairness and bias in AI
  • Develop a technique for understanding AI models
  • Design an AI model that is interpretable by design
  • Build a paradigm for detecting and/or correcting failure points in an AI model
  • Analyze an existing AI model and/or dataset to better understand its failure points
  • Build a computer vision system for another domain (e.g., medical imaging, satellite data, etc.)
  • Develop a software package for explainable AI
  • Adapt explainable AI research to a consumer-facing problem

Note: I am happy to advise any project if there's a sufficient overlap in interest and/or expertise; please reach out via email to chat about project ideas.

Tom Griffiths, Room 405

Available for Fall 2024 single-semester IW advising, only

Research areas: computational cognitive science, computational social science, machine learning and artificial intelligence

Note: I am open to projects that apply ideas from computer science to understanding aspects of human cognition in a wide range of areas, from decision-making to cultural evolution and everything in between. For example, we have current projects analyzing chess game data and magic tricks, both of which give us clues about how human minds work. Students who have expertise or access to data related to games, magic, strategic sports like fencing, or other quantifiable domains of human behavior feel free to get in touch.

Aarti Gupta, Room 220

  • Research Areas: Formal methods, program analysis, logic decision procedures
  • Finding bugs in open source software using automatic verification tools
  • Software verification (program analysis, model checking, test generation)
  • Decision procedures for logical reasoning (SAT solvers, SMT solvers)

Elad Hazan, Room 409  

  • Research interests: machine learning methods and algorithms, efficient methods for mathematical optimization, regret minimization in games, reinforcement learning, control theory and practice
  • Machine learning, efficient methods for mathematical optimization, statistical and computational learning theory, regret minimization in games.
  • Implementation and algorithm engineering for control, reinforcement learning and robotics
  • Implementation and algorithm engineering for time series prediction

Felix Heide, Room 410

  • Research Areas: Computational Imaging, Computer Vision, Machine Learning (focus on Optimization and Approximate Inference).
  • Optical Neural Networks
  • Hardware-in-the-loop Holography
  • Zero-shot and Simulation-only Learning
  • Object recognition in extreme conditions
  • 3D Scene Representations for View Generation and Inverse Problems
  • Long-range Imaging in Scattering Media
  • Hardware-in-the-loop Illumination and Sensor Optimization
  • Inverse Lidar Design
  • Phase Retrieval Algorithms
  • Proximal Algorithms for Learning and Inference
  • Domain-Specific Language for Optics Design

Peter Henderson , 302 Sherrerd Hall

  • Research Areas: Machine learning, law, and policy

Kyle Jamieson, Room 306

  • Research areas: Wireless and mobile networking; indoor radar and indoor localization; Internet of Things
  • See other topics on my independent work  ideas page  (campus IP and CS dept. login req'd)

Alan Kaplan, 221 Nassau Street, Room 105

Research Areas:

  • Random apps of kindness - mobile application/technology frameworks used to help individuals or communities; topic areas include, but are not limited to: first response, accessibility, environment, sustainability, social activism, civic computing, tele-health, remote learning, crowdsourcing, etc.
  • Tools automating programming language interoperability - Java/C++, React Native/Java, etc.
  • Software visualization tools for education
  • Connected consumer devices, applications and protocols

Brian Kernighan, Room 311

  • Research Areas: application-specific languages, document preparation, user interfaces, software tools, programming methodology
  • Application-oriented languages, scripting languages.
  • Tools; user interfaces
  • Digital humanities

Zachary Kincaid, Room 219

  • Research areas: programming languages, program analysis, program verification, automated reasoning
  • Independent Research Topics:
  • Develop a practical algorithm for an intractable problem (e.g., by developing practical search heuristics, or by reducing to or identifying a tractable sub-problem, ...).
  • Design a domain-specific programming language, or prototype a new feature for an existing language.
  • Any interesting project related to programming languages or logic.

Gillat Kol, Room 316

  • Research area: theory

Aleksandra Korolova, 309 Sherrerd Hall

  • Research areas: Societal impacts of algorithms and AI; privacy; fair and privacy-preserving machine learning; algorithm auditing.

Advisees typically have taken one or more of COS 226, COS 324, COS 423, COS 424 or COS 445.

Pravesh Kothari, Room 320

  • Research areas: Theory

Amit Levy, Room 307

  • Research Areas: Operating Systems, Distributed Systems, Embedded Systems, Internet of Things
  • Distributed hardware testing infrastructure
  • Second factor security tokens
  • Low-power wireless network protocol implementation
  • USB device driver implementation

Kai Li, Room 321

  • Research Areas: Distributed systems; storage systems; content-based search and data analysis of large datasets.
  • Fast communication mechanisms for heterogeneous clusters.
  • Approximate nearest-neighbor search for high dimensional data.
  • Data analysis and prediction of in-patient medical data.
  • Optimized implementation of classification algorithms on manycore processors.

Xiaoyan Li, 221 Nassau Street, Room 104

  • Research areas: Information retrieval, novelty detection, question answering, AI, machine learning and data analysis.
  • Explore new statistical retrieval models for document retrieval and question answering.
  • Apply AI in various fields.
  • Apply supervised or unsupervised learning in health, education, finance, and social networks, etc.
  • Any interesting project related to AI, machine learning, and data analysis.

Lydia Liu, Room 414

  • Research Areas: algorithmic decision making, machine learning and society
  • Theoretical foundations for algorithmic decision making (e.g. mathematical modeling of data-driven decision processes, societal level dynamics)
  • Societal impacts of algorithms and AI through a socio-technical lens (e.g. normative implications of worst case ML metrics, prediction and model arbitrariness)
  • Machine learning for social impact domains, especially education (e.g. responsible development and use of LLMs for education equity and access)
  • Evaluation of human-AI decision making using statistical methods (e.g. causal inference of long term impact)

Wyatt Lloyd, Room 323

  • Research areas: Distributed Systems
  • Caching algorithms and implementations
  • Storage systems
  • Distributed transaction algorithms and implementations

Alex Lombardi, Room 312

  • Research Areas: Theory

Margaret Martonosi, Room 208

  • Quantum Computing research, particularly related to architecture and compiler issues for QC.
  • Computer architectures specialized for modern workloads (e.g., graph analytics, machine learning algorithms, mobile applications)
  • Investigating security and privacy vulnerabilities in computer systems, particularly IoT devices.
  • Other topics in computer architecture or mobile / IoT systems also possible.

Jonathan Mayer, Sherrerd Hall, Room 307 

Available for Spring 2025 single-semester IW only

  • Research areas: Technology law and policy, with emphasis on national security, criminal procedure, consumer privacy, network management, and online speech.
  • Assessing the effects of government policies, both in the public and private sectors.
  • Collecting new data that relates to government decision making, including surveying current business practices and studying user behavior.
  • Developing new tools to improve government processes and offer policy alternatives.

Mae Milano, Room 307

  • Local-first / peer-to-peer systems
  • Wide-area storage systems
  • Consistency and protocol design
  • Type-safe concurrency
  • Language design
  • Gradual typing
  • Domain-specific languages
  • Languages for distributed systems

Andrés Monroy-Hernández, Room 405

  • Research Areas: Human-Computer Interaction, Social Computing, Public-Interest Technology, Augmented Reality, Urban Computing
  • Research interests: developing public-interest socio-technical systems. We are currently creating alternatives to gig work platforms that are more equitable for all stakeholders. For instance, we are investigating the socio-technical affordances necessary to support a co-op food delivery network owned and managed by workers and restaurants. We are exploring novel system designs that support self-governance, decentralized/federated models, community-centered data ownership, and portable reputation systems. We have opportunities for students interested in human-centered computing, UI/UX design, full-stack software development, and qualitative/quantitative user research.
  • Beyond our core projects, we are open to working on research projects that explore the use of emerging technologies, such as AR, wearables, NFTs, and DAOs, for creative and out-of-the-box applications.

Christopher Moretti, Corwin Hall, Room 036

  • Research areas: Distributed systems, high-throughput computing, computer science/engineering education
  • Expansion, improvement, and evaluation of open-source distributed computing software.
  • Applications of distributed computing for "big science" (e.g. biometrics, data mining, bioinformatics)
  • Software and best practices for computer science education and study, especially Princeton's 126/217/226 sequence or MOOCs development
  • Sports analytics and/or crowd-sourced computing

Radhika Nagpal, F316 Engineering Quadrangle

  • Research areas: control, robotics and dynamical systems

Karthik Narasimhan, Room 422

  • Research areas: Natural Language Processing, Reinforcement Learning
  • Autonomous agents for text-based games ( https://www.microsoft.com/en-us/research/project/textworld/ )
  • Transfer learning/generalization in NLP
  • Techniques for generating natural language
  • Model-based reinforcement learning

Arvind Narayanan, 308 Sherrerd Hall 

Research Areas: fair machine learning (and AI ethics more broadly), the social impact of algorithmic systems, tech policy

Pedro Paredes, Corwin Hall, Room 041

My primary research work is in Theoretical Computer Science.

  • Research Interests: Spectral graph theory, pseudorandomness, complexity theory, coding theory, quantum information theory, combinatorics.

The IW projects I am interested in advising can be divided into three categories:

 1. Theoretical research

I am open to advising work on research projects in any topic in one of my research areas of interest. A project could also be based on writing a survey of results from a few papers. Students should have a solid background in math (e.g., elementary combinatorics, graph theory, discrete probability, basic algebra/calculus) and theoretical computer science (226 and 240 material, like big-O/Omega/Theta, basic complexity theory, basic fundamental algorithms). Mathematical maturity is a must.

A (non-exhaustive) list of topics I'm interested in:

  • Explicit constructions of better vertex expanders and/or unique neighbor expanders.
  • Constructions of deterministic or random high-dimensional expanders.
  • Pseudorandom generators for different problems.
  • Topics around the quantum PCP conjecture.
  • Topics around quantum error correcting codes and locally testable codes, including constructions, encoding and decoding algorithms.

 2. Theory-informed practical implementations of algorithms

Very often, great advances in theoretical research are either not tested in practice or not even feasible to implement in practice. Thus, I am interested in any project that consists of trying to make theoretical ideas applicable in practice. This includes coming up with new algorithms that trade some theoretical guarantees for a feasible implementation while trying to retain the soul of the original idea; implementing new algorithms in a suitable programming language; and empirically testing practical implementations and comparing them with benchmarks / theoretical expectations. A project in this area doesn't have to be in my main areas of research; any theoretical result could be suitable for such a project.
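As a concrete illustration of this kind of project (not part of the advisor's materials), here is a minimal sketch of the Misra-Gries heavy-hitters summary, a one-pass streaming algorithm whose theoretical guarantee (every item occurring more than n/k times survives among at most k-1 counters) can be checked empirically against exact counts.

```python
from collections import Counter

def misra_gries(stream, k):
    """One-pass Misra-Gries summary with at most k-1 counters.

    Guarantee: any item occurring more than len(stream)/k times is
    present in the returned dictionary (its count may be underestimated).
    """
    counters = {}
    for x in stream:
        if x in counters:
            counters[x] += 1
        elif len(counters) < k - 1:
            counters[x] = 1
        else:
            # Decrement all counters; drop any that reach zero.
            for key in list(counters):
                counters[key] -= 1
                if counters[key] == 0:
                    del counters[key]
    return counters

if __name__ == "__main__":
    stream = list("abracadabra" * 1000) + ["z", "q", "x"]
    summary = misra_gries(stream, k=4)
    exact = Counter(stream)
    for item, estimate in summary.items():
        print(item, "estimate:", estimate, "exact:", exact[item])
```

A project in this vein would go further, for example by profiling the implementation against naive counting on large streams and relating the observed error to the theoretical bound.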

Some examples of areas of interest:

  • Streaming algorithms.
  • Numeric linear algebra.
  • Property testing.
  • Parallel / distributed algorithms.
  • Online algorithms.

 3. Machine learning with a theoretical foundation

I am interested in projects in machine learning that have some mathematical/theoretical component, even if most of the project is applied. This includes topics like mathematical optimization, statistical learning, fairness, and privacy.

One particular area I have recently been interested in is rating systems (e.g., chess Elo ratings) and applications of these to experts problems.
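For readers unfamiliar with rating systems, the sketch below shows the standard Elo update rule; the K-factor of 32 and the sample ratings are illustrative assumptions, and a research project here would analyze or extend such rules rather than simply reimplement them.

```python
def expected_score(rating_a, rating_b):
    # Probability that player A beats player B under the Elo model.
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400.0))

def elo_update(rating_a, rating_b, score_a, k=32):
    # score_a is 1 for a win by A, 0.5 for a draw, 0 for a loss.
    exp_a = expected_score(rating_a, rating_b)
    new_a = rating_a + k * (score_a - exp_a)
    new_b = rating_b + k * ((1 - score_a) - (1 - exp_a))
    return new_a, new_b

# Example: a 1500-rated player upsets a 1700-rated player.
print(elo_update(1500, 1700, score_a=1))
```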

Final Note: I am also willing to advise any project with any mathematical/theoretical component, even if it's not the main one; please reach out via email to chat about project ideas.

Iasonas Petras, Corwin Hall, Room 033

  • Research Areas: Information Based Complexity, Numerical Analysis, Quantum Computation.
  • Prerequisites: Reasonable mathematical maturity. In the case of a project related to Quantum Computation, a certain familiarity with quantum mechanics is required (related courses: ELE 396/PHY 208).
  • Possible research topics include:

1.   Quantum algorithms and circuits:

  • i. Design or simulation of quantum circuits implementing quantum algorithms.
  • ii. Design of quantum algorithms solving/approximating continuous problems (such as Eigenvalue problems for Partial Differential Equations).

2.   Information Based Complexity:

  • i. Necessary and sufficient conditions for tractability of Linear and Linear Tensor Product Problems in various settings (for example worst case or average case). 
  • ii. Necessary and sufficient conditions for tractability of Linear and Linear Tensor Product Problems under new tractability and error criteria.
  • iii. Necessary and sufficient conditions for tractability of Weighted problems.
  • iv. Necessary and sufficient conditions for tractability of Weighted Problems under new tractability and error criteria.

3. Topics in Scientific Computation:

  • i. Randomness, Pseudorandomness, MC and QMC methods and their applications (Finance, etc)

Yuri Pritykin, 245 Carl Icahn Lab

  • Research interests: Computational biology; Cancer immunology; Regulation of gene expression; Functional genomics; Single-cell technologies.
  • Potential research projects: Development, implementation, assessment and/or application of algorithms for analysis, integration, interpretation and visualization of multi-dimensional data in molecular biology, particularly single-cell and spatial genomics data.

Benjamin Raphael, Room 309  

  • Research interests: Computational biology and bioinformatics; Cancer genomics; Algorithms and machine learning approaches for analysis of large-scale datasets
  • Implementation and application of algorithms to infer evolutionary processes in cancer
  • Identifying correlations between combinations of genomic mutations in human and cancer genomes
  • Design and implementation of algorithms for genome sequencing from new DNA sequencing technologies
  • Graph clustering and network anomaly detection, particularly using diffusion processes and methods from spectral graph theory

Vikram Ramaswamy, 035 Corwin Hall

  • Research areas: Interpretability of AI systems, Fairness in AI systems, Computer vision.
  • Constructing a new method to explain a model / create an interpretable by design model
  • Analyzing a current model / dataset to understand bias within the model/dataset
  • Proposing new fairness evaluations
  • Proposing new methods to train to improve fairness
  • Developing synthetic datasets for fairness / interpretability benchmarks
  • Understanding robustness of models

Ran Raz, Room 240

  • Research Area: Computational Complexity
  • Independent Research Topics: Computational Complexity, Information Theory, Quantum Computation, Theoretical Computer Science

Szymon Rusinkiewicz, Room 406

  • Research Areas: computer graphics; computer vision; 3D scanning; 3D printing; robotics; documentation and visualization of cultural heritage artifacts
  • Research ways of incorporating rotation invariance into computer vision tasks such as feature matching and classification
  • Investigate approaches to robust 3D scan matching
  • Model and compensate for imperfections in 3D printing
  • Given a collection of small mobile robots, apply control policies learned in simulation to the real robots.

Olga Russakovsky, Room 408

  • Research Areas: computer vision, machine learning, deep learning, crowdsourcing, fairness & bias in AI
  • Design a semantic segmentation deep learning model that can operate in a zero-shot setting (i.e., recognize and segment objects not seen during training)
  • Develop a deep learning classifier that is impervious to protected attributes (such as gender or race) that may be erroneously correlated with target classes
  • Build a computer vision system for the novel task of inferring what object (or part of an object) a human is referring to when pointing to a single pixel in the image. This includes both collecting an appropriate dataset using crowdsourcing on Amazon Mechanical Turk, creating a new deep learning formulation for this task, and running extensive analysis of both the data and the model

Sebastian Seung, Princeton Neuroscience Institute, Room 153

  • Research Areas: computational neuroscience, connectomics, "deep learning" neural networks, social computing, crowdsourcing, citizen science
  • Gamification of neuroscience (EyeWire  2.0)
  • Semantic segmentation and object detection in brain images from microscopy
  • Computational analysis of brain structure and function
  • Neural network theories of brain function

Jaswinder Pal Singh, Room 324

  • Research Areas: Boundary of technology and business/applications; building and scaling technology companies with special focus at that boundary; parallel computing systems and applications: parallel and distributed applications and their implications for software and architectural design; system software and programming environments for multiprocessors.
  • Develop a startup company idea, and build a plan/prototype for it.
  • Explore tradeoffs at the boundary of technology/product and business/applications in a chosen area.
  • Study and develop methods to infer insights from data in different application areas, from science to search to finance to others. 
  • Design and implement a parallel application. Possible areas include graphics, compression, biology, among many others. Analyze performance bottlenecks using existing tools, and compare programming models/languages.
  • Design and implement a scalable distributed algorithm.

Mona Singh, Room 420

  • Research Areas: computational molecular biology, as well as its interface with machine learning and algorithms.
  • Whole and cross-genome methods for predicting protein function and protein-protein interactions.
  • Analysis and prediction of biological networks.
  • Computational methods for inferring specific aspects of protein structure from protein sequence data.
  • Any other interesting project in computational molecular biology.

Robert Tarjan, 194 Nassau St., Room 308

  • Research Areas: Data structures; graph algorithms; combinatorial optimization; computational complexity; computational geometry; parallel algorithms.
  • Implement one or more data structures or combinatorial algorithms to provide insight into their empirical behavior.
  • Design and/or analyze various data structures and combinatorial algorithms.

Olga Troyanskaya, Room 320

  • Research Areas: Bioinformatics; analysis of large-scale biological data sets (genomics, gene expression, proteomics, biological networks); algorithms for integration of data from multiple data sources; visualization of biological data; machine learning methods in bioinformatics.
  • Implement and evaluate one or more gene expression analysis algorithm.
  • Develop algorithms for assessment of performance of genomic analysis methods.
  • Develop, implement, and evaluate visualization tools for heterogeneous biological data.

David Walker, Room 211

  • Research Areas: Programming languages, type systems, compilers, domain-specific languages, software-defined networking and security
  • Independent Research Topics:  Any other interesting project that involves humanitarian hacking, functional programming, domain-specific programming languages, type systems, compilers, software-defined networking, fault tolerance, language-based security, theorem proving, logic or logical frameworks.

Shengyi Wang, Postdoctoral Research Associate, Room 216

Available for Fall 2024 single-semester IW only

  • Independent Research topics: Explore Escher-style tilings using (introductory) group theory and automata theory to produce beautiful pictures.

Kevin Wayne, Corwin Hall, Room 040

  • Research Areas: design, analysis, and implementation of algorithms; data structures; combinatorial optimization; graphs and networks.
  • Design and implement computer visualizations of algorithms or data structures.
  • Develop pedagogical tools or programming assignments for the computer science curriculum at Princeton and beyond.
  • Develop assessment infrastructure and assessments for MOOCs.

Matt Weinberg, 194 Nassau St., Room 222

  • Research Areas: algorithms, algorithmic game theory, mechanism design, game theoretical problems in {Bitcoin, networking, healthcare}.
  • Theoretical questions related to COS 445 topics such as matching theory, voting theory, auction design, etc. 
  • Theoretical questions related to incentives in applications like Bitcoin, the Internet, health care, etc. In a little bit more detail: protocols for these systems are often designed assuming that users will follow them. But often, users will actually be strictly happier to deviate from the intended protocol. How should we reason about user behavior in these protocols? How should we design protocols in these settings?

Huacheng Yu, Room 310

  • data structures
  • streaming algorithms
  • design and analyze data structures / streaming algorithms
  • prove impossibility results (lower bounds)
  • implement and evaluate data structures / streaming algorithms

Ellen Zhong, Room 314

Opportunities outside the department.

We encourage students to look into doing interdisciplinary computer science research and to work with professors in departments other than computer science. However, every CS independent work project must have a strong computer science element (even if it has other scientific or artistic elements as well). To do a project with an adviser outside of computer science you must have permission of the department. This can be accomplished by having a second co-adviser within the computer science department or by contacting the independent work supervisor about the project and having him or her sign the independent work proposal form.

Here is a list of professors outside the computer science department who are eager to work with computer science undergraduates.

Maria Apostolaki, Engineering Quadrangle, C330

  • Research areas: Computing & Networking, Data & Information Science, Security & Privacy

Branko Glisic, Engineering Quadrangle, Room E330

  • Documentation of historic structures
  • Cyber physical systems for structural health monitoring
  • Developing virtual and augmented reality applications for documenting structures
  • Applying machine learning techniques to generate 3D models from 2D plans of buildings
  • Contact: Rebecca Napolitano, rkn2 (@princeton.edu)

Mihir Kshirsagar, Sherrerd Hall, Room 315

Center for Information Technology Policy.

  • Consumer protection
  • Content regulation
  • Competition law
  • Economic development
  • Surveillance and discrimination

Sharad Malik, Engineering Quadrangle, Room B224

Select a Senior Thesis Adviser for the 2020-21 Academic Year.

  • Design of reliable hardware systems
  • Verifying complex software and hardware systems

Prateek Mittal, Engineering Quadrangle, Room B236

  • Internet security and privacy 
  • Social Networks
  • Privacy technologies, anonymous communication
  • Network Science
  • Internet security and privacy: The insecurity of Internet protocols and services threatens the safety of our critical network infrastructure and billions of end users. How can we defend end users as well as our critical network infrastructure from attacks?
  • Trustworthy social systems: Online social networks (OSNs) such as Facebook, Google+, and Twitter have revolutionized the way our society communicates. How can we leverage social connections between users to design the next generation of communication systems?
  • Privacy Technologies: Privacy on the Internet is eroding rapidly, with businesses and governments mining sensitive user information. How can we protect the privacy of our online communications? The Tor project (https://www.torproject.org/) is a potential application of interest.

Ken Norman,  Psychology Dept, PNI 137

  • Research Areas: Memory, the brain and computation 
  • Lab:  Princeton Computational Memory Lab

Potential research topics

  • Methods for decoding cognitive state information from neuroimaging data (fMRI and EEG) 
  • Neural network simulations of learning and memory

Caroline Savage

Office of Sustainability, Phone:(609)258-7513, Email: cs35 (@princeton.edu)

The  Campus as Lab  program supports students using the Princeton campus as a living laboratory to solve sustainability challenges. The Office of Sustainability has created a list of campus as lab research questions, filterable by discipline and topic, on its  website .

An example from Computer Science could include using TigerEnergy, a platform which provides real-time data on campus energy generation and consumption, to study one of the many energy systems or buildings on campus. Three CS students used TigerEnergy to create a live energy heatmap of campus.

Other potential projects include:

  • Apply game theory to sustainability challenges
  • Develop a tool to help visualize interactions between complex campus systems, e.g. energy and water use, transportation and storm water runoff, purchasing and waste, etc.
  • How can we learn (in aggregate) about individuals’ waste, energy, transportation, and other behaviors without impinging on privacy?

Janet Vertesi, Sociology Dept, Wallace Hall, Room 122

  • Research areas: Sociology of technology; Human-computer interaction; Ubiquitous computing.
  • Possible projects: At the intersection of computer science and social science, my students have built mixed reality games, produced artistic and interactive installations, and studied mixed human-robot teams, among other projects.

David Wentzlaff, Engineering Quadrangle, Room 228

Computing, Operating Systems, Sustainable Computing.

  • Instrument Princeton's Green (HPCRC) data center
  • Investigate power utilization on a processor core implemented in an FPGA
  • Dismantle and document all of the components in modern electronics. Invent new ways to build computers that can be recycled easier.
  • Other topics in parallel computer architecture or operating systems


California State University, San Bernardino


Computer Science and Engineering Theses, Projects, and Dissertations

Theses/Projects/Dissertations from 2024

TRAFFIC ANALYSIS OF CITIES IN SAN BERNARDINO COUNTY , Sai Kalyan Ayyagari

Recommendation System using machine learning for fertilizer prediction , Durga Rajesh Bommireddy

Classification of Remote Sensing Image Data Using Rsscn-7 Dataset , Satya Priya Challa

Cultural Awareness Application , Bharat Gupta

PREDICTING HOSPITALIZATION USING ARTIFICIAL INTELLIGENCE , Sanath Hiremath

AUTOMATED BRAIN TUMOR CLASSIFIER WITH DEEP LEARNING , venkata sai krishna chaitanya kandula

TRUCK TRAFFIC ANALYSIS IN THE INLAND EMPIRE , Bhavik Khatri

Crash Detecting System Using Deep Learning , Yogesh Reddy Muddam

A SMART HYBRID ENHANCED RECOMMENDATION AND PERSONALIZATION ALGORITHM USING MACHINE LEARNING , Aswin Kumar Nalluri

Theses/Projects/Dissertations from 2023

CLASSIFICATION OF LARGE SCALE FISH DATASET BY DEEP NEURAL NETWORKS , Priyanka Adapa

GEOSPATIAL WILDFIRE RISK PREDICTION USING DEEP LEARNING , Abner Alberto Benavides

HUMAN SUSPICIOUS ACTIVITY DETECTION , Nilamben Bhuva

MAX FIT EVENT MANAGEMENT WITH SALESFORCE , AKSHAY DAGWAR

MELANOMA DETECTION BASED ON DEEP LEARNING NETWORKS , Sanjay Devaraneni

Heart Disease Prediction Using Binary Classification , Virendra Sunil Devare

CLASSIFICATION OF THORAX DISEASES FROM CHEST X-RAY IMAGES , Sharad Jayusukhbhai Dobariya

WEB BASED MANAGEMENT SYSTEM FOR HOUSING SOCIETY , Likhitha Reddy Eddala

Sales and Stock Management System , Rashmika Gaddam Ms

CONTACTLESS FOOD ORDERING SYSTEM , Rishivar Kumar Goli

RESTAURANT MANAGEMENT WEBSITE , Akhil Sai Gollapudi

DISEASE OF LUNG INFECTION DETECTION USING CNN MODEL -BAYESIAN OPTIMIZATION , poojitha gutha

DATA POISONING ATTACKS ON PHASOR MEASUREMENT UNIT DATA , Rutuja Sanjeev Haridas

CRIME MAPPING ANALYSIS USING WEB APPLICATION. , Lavanya Krishnappa

A LONG-TERM FUNDS PREDICTOR BASED ON DEEP LEARNING , SHUIYI KUANG

LIVER SEGMENTATION AND LESION DETECTION IN MEDICAL IMAGES USING A DEEP LEARNING-BASED U-NET MODEL , Kaushik Mahida

PHASOR MEASUREMENT UNIT DATA VISUALIZATION , Nikhila Mandava

TWITTER POLICING , Hemanth Kumar Medisetty

TRANSACTION MANAGEMENT SYSYEM FOR A PUBLISHER , HASSAIN SHAREEF MOHAMMED JR

LOBANGU: AN OPTICAL CHARACTER RECOGNITION RECEIPT MANAGEMENT APP FOR HEALTH CENTER PHARMACIES IN THE D.R.CONGO AND SURROUNDING EASTERN AFRICAN COUNTRIES , Bénis Munganga

PREDICTIVE MODEL FOR CFPB CONSUMER COMPLAINTS , Vyshnavi Nalluri

REVIEW CLASSIFICATION USING NATURAL LANGUAGE PROCESSING AND DEEP LEARNING , Brian Nazareth

Brain Tumor Detection Using MRI Images , Mayur Patel

QUIZ WEB APPLICATION , Dipti Rathod

HYPOTHYROID DISEASE ANALYSIS BY USING MACHINE LEARNING , SANJANA SEELAM

Pillow Based Sleep Tracking Device Using Raspberry Pi , Venkatachalam Seviappan

FINSERV ANDROID APPLICATION , Harsh Piyushkumar Shah

AUTOMATED MEDICAL NOTES LABELLING AND CLASSIFICATION USING MACHINE LEARNING , Akhil Prabhakar Thota

GENETIC PROGRAMMING TO OPTIMIZE PERFORMANCE OF MACHINE LEARNING ALGORITHMS ON UNBALANCED DATA SET , Asitha Thumpati

GOVERNMENT AID PORTAL , Darshan Togadiya

GENERAL POPULATION PROJECTION MODEL WITH CENSUS POPULATION DATA , Takenori Tsuruga

LUNG LESION SEGMENTATION USING DEEP LEARNING APPROACHES , Sree Snigdha Tummala

DETECTION OF PHISHING WEBSITES USING MACHINE LEARNING , Saranya Valleri

Machine Learning for Kalman Filter Tuning Prediction in GPS/INS Trajectory Estimation , Peter Wright

Theses/Projects/Dissertations from 2022

LEARN PROGRAMMING IN VIRTUAL REALITY? A PROJECT FOR COMPUTER SCIENCE STUDENTS , Benjamin Alexander

LUNG CANCER TYPE CLASSIFICATION , Mohit Ramajibhai Ankoliya

HIGH-RISK PREDICTION FOR COVID-19 PATIENTS USING MACHINE LEARNING , Raja Kajuluri

IMPROVING INDIA’S TRAFFIC MANAGEMENT USING INTELLIGENT TRANSPORTATION SYSTEMS , Umesh Makhloga

DETECTION OF EPILEPSY USING MACHINE LEARNING , Balamurugan Murugesan

SOCIAL MOBILE APPLICATION: UDROP , Mahmoud Oraiqat

Improved Sensor-Based Human Activity Recognition Via Hybrid Convolutional and Recurrent Neural Networks , Sonia Perez-Gamboa

College of Education FileMaker Extraction and End-User Database Development , Andrew Tran

DEEP LEARNING EDGE DETECTION IN IMAGE INPAINTING , Zheng Zheng

Theses/Projects/Dissertations from 2021

A General Conversational Chatbot , Vipin Nambiar

Verification System , Paras Nigam

DESKTOP APPLICATION FOR THE PUZZLE BOARD GAME “RUSH HOUR” , Huanqing Nong

Ahmedabad City App , Rushabh Picha

COMPUTER SURVEILLANCE SYSTEM USING WI-FI FOR ANDROID , Shashank Reddy Saireddy

ANDROID PARKING SYSTEM , Vishesh Reddy Sripati

Sentiment Analysis: Stock Index Prediction with Multi-task Learning and Word Polarity Over Time , Yue Zhou

Theses/Projects/Dissertations from 2020

BUBBLE-IN DIGITAL TESTING SYSTEM , Chaz Hampton

FEEDBACK REVIEW SYSTEM USING SENTIMENT ANALYSIS , Vineeth Kukkamalla

WEB APPLICATION FOR MOVIE PERFORMANCE PREDICTION , Devalkumar Patel

Theses/Projects/Dissertations from 2019

REVIEWS TO RATING CONVERSION AND ANALYSIS USING MACHINE LEARNING TECHNIQUES , Charitha Chanamolu

EASY EXAM , SARTHAK DABHI

EXTRACT TRANSFORM AND LOADING TOOL FOR EMAIL , Amit Rajiv Lawanghare

VEHICLE INFORMATION SYSTEM USING BLOCKCHAIN , Amey Zulkanthiwar

Theses/Projects/Dissertations from 2018

USING AUTOENCODER TO REDUCE THE LENGTH OF THE AUTISM DIAGNOSTIC OBSERVATION SCHEDULE (ADOS) , Sara Hussain Daghustani

California State University, San Bernardino Chatbot , Krutarth Desai

ORGANIZE EVENTS MOBILE APPLICATION , Thakshak Mani Chandra Reddy Gudimetla

SOCIAL NETWORK FOR SOFTWARE DEVELOPERS , Sanket Prabhakar Jadhav

VIRTUALIZED CLOUD PLATFORM MANAGEMENT USING A COMBINED NEURAL NETWORK AND WAVELET TRANSFORM STRATEGY , Chunyu Liu

INTER PROCESS COMMUNICATION BETWEEN TWO SERVERS USING MPICH , Nagabhavana Narla

SENSOR-BASED HUMAN ACTIVITY RECOGNITION USING BIDIRECTIONAL LSTM FOR CLOSELY RELATED ACTIVITIES , Arumugam Thendramil Pavai

NEURAL NETWORK ON VIRTUALIZATION SYSTEM, AS A WAY TO MANAGE FAILURE EVENTS OCCURRENCE ON CLOUD COMPUTING , Khoi Minh Pham

EPICCONFIGURATOR COMPUTER CONFIGURATOR AND CMS PLATFORM , IVO A. TANTAMANGO

STUDY ON THE PATTERN RECOGNITION ENHANCEMENT FOR MATRIX FACTORIZATIONS WITH AUTOMATIC RELEVANCE DETERMINATION , hau tao

Theses/Projects/Dissertations from 2017

CHILDREN’S SOCIAL NETWORK: KIDS CLUB , Eiman Alrashoud

MULTI-WAY COMMUNICATION SYSTEM , S. Chinnam

WEB APPLICATION FOR GRADUATE COURSE RECOMMENDATION SYSTEM , Sayali Dhumal

MOBILE APPLICATION FOR ATTENDANCE SYSTEM COYOTE-ATTENDANCE , Sindhu Hari

WEB APPLICATION FOR GRADUATE COURSE ADVISING SYSTEM , Sanjay Karrolla

Custom T-Shirt Designs , Ranjan Khadka

STUDENT CLASS WAITING LIST ENROLLMENT , AISHWARYA LACHAGARI

ANDROID MOBILE APPLICATION FOR HOSPITAL EXECUTIVES , Vihitha Nalagatla

PIPPIN MACHINE , Kiran Reddy Pamulaparthy

SOUND MODE APPLICATION , Sindhuja Pogaku

I2MAPREDUCE: DATA MINING FOR BIG DATA , Vishnu Vardhan Reddy Sherikar

COMPARING AND IMPROVING FACIAL RECOGNITION METHOD , Brandon Luis Sierra

NATURAL LANGUAGE PROCESSING BASED GENERATOR OF TESTING INSTRUMENTS , Qianqian Wang

AUTOMATIC GENERATION OF WEB APPLICATIONS AND MANAGEMENT SYSTEM , Yu Zhou

Theses/Projects/Dissertations from 2016

CLOTH - MODELING, DEFORMATION, AND SIMULATION , Thanh Ho

CoyoteLab - Linux Containers for Educational Use , Michael D. Korcha

PACKET FILTER APPROACH TO DETECT DENIAL OF SERVICE ATTACKS , Essa Yahya M Muharish

DATA MINING: TRACKING SUSPICIOUS LOGGING ACTIVITY USING HADOOP , Bir Apaar Singh Sodhi

Theses/Projects/Dissertations from 2015

APPLY DATA CLUSTERING TO GENE EXPRESSION DATA , Abdullah Jameel Abualhamayl Mr.

Density Based Data Clustering , Rayan Albarakati

Developing Java Programs on Android Mobile Phones Using Speech Recognition , Santhrushna Gande

THE DESIGN AND IMPLEMENTATION OF AN ADAPTIVE CHESS GAME , Mehdi Peiravi

CALIFORNIA STATE UNIVERSITY SAN BERNARDINO WiN GPS , Francisco A. Ron

ESTIMATION ON GIBBS ENTROPY FOR AN ENSEMBLE , Lekhya Sai Sake


Career Karma


The Top 10 Most Interesting Computer Science Research Topics

Computer science touches nearly every area of our lives. With new advancements in technology, the computer science field is constantly evolving, giving rise to new computer science research topics. These topics attempt to answer various computer science research questions and how they affect the tech industry and the larger world.

Computer science research topics can be divided into several categories, such as artificial intelligence, big data and data science, human-computer interaction, security and privacy, and software engineering. If you are a student or researcher looking for computer research paper topics, this article provides some suggestions and examples of computer science research topics and questions.


What Makes a Strong Computer Science Research Topic?

A strong computer science topic is clear, well-defined, and easy to understand. It should also reflect the research’s purpose, scope, or aim. In addition, a strong computer science research topic is devoid of abbreviations that are not generally known, though it can include industry terms that are current and generally accepted.

Tips for Choosing a Computer Science Research Topic

  • Brainstorm. Brainstorming helps you develop a few different ideas and find the best topic for you. Some core questions to ask are: What are some open questions in computer science? What do you want to learn more about? What are some current trends in computer science?
  • Choose a sub-field. There are many subfields and career paths in computer science. Before choosing a research topic, ensure that you point out which aspect of computer science the research will focus on. That could be theoretical computer science, contemporary computing culture, or even distributed computing research topics.
  • Aim to answer a question. When you’re choosing a research topic in computer science, you should always have a question in mind that you’d like to answer. That helps you narrow down your research aim to meet specified, clear goals.
  • Do a comprehensive literature review. When starting a research project, it is essential to have a clear idea of the topic you plan to study. That involves doing a comprehensive literature review to better understand what has been learned about your topic in the past.
  • Keep the topic simple and clear. The topic should reflect the scope and aim of the research it addresses. It should also be concise and free of ambiguous words. Hence, some researchers recommend that the topic be limited to five to 15 substantive words. It can take the form of a question or a declarative statement.

What’s the Difference Between a Research Topic and a Research Question?

A research topic is the subject matter that a researcher chooses to investigate. You may also refer to it as the title of a research paper. It summarizes the scope of the research and captures the researcher’s approach to the research question. Hence, it may be broad or more specific. For example, a broad topic may read "Data Protection and Blockchain," while a more specific variant might read "Potential Strategies for Privacy Issues on the Blockchain."

On the other hand, a research question is the fundamental starting point for any research project. It typically reflects various real-world problems and, sometimes, theoretical computer science challenges. As such, it must be clear, concise, and answerable.

How to Create Strong Computer Science Research Questions

To create substantial computer science research questions, one must first understand the topic at hand. Furthermore, the research question should generate new knowledge and contribute to the advancement of the field. It could be something that has not been answered before or is only partially answered. It is also essential to consider the feasibility of answering the question.

Top 10 Computer Science Research Paper Topics

1. Battery Life and Energy Storage for 5G Equipment

The 5G network is an upcoming cellular network with much higher data rates and capacity than the current 4G network. According to research published in the European Scientific Institute Journal, one of the main concerns with the 5G network is the high energy consumption of 5G-enabled devices. Hence, research on this topic can highlight the challenges and propose solutions for more energy-efficient designs.

2. The Influence of Extraction Methods on Big Data Mining

Data mining has drawn the scientific community’s attention, especially with the explosive rise of big data. Many research results show that the extraction methods used have a significant effect on the outcome of the data mining process. A topic like this analyzes existing extraction algorithms and suggests strategies and more efficient algorithms that may help understand the challenge or point the way to a solution.

3. Integration of 5G with Analytics and Artificial Intelligence

According to the International Finance Corporation, 5G and AI technologies are defining emerging markets and our world. Through different technologies, this research aims to find novel ways to integrate these powerful tools to produce excellent results. Subjects like this often spark great discoveries that pioneer new levels of research and innovation. A breakthrough can influence advanced educational technology, virtual reality, metaverse, and medical imaging.

4. Leveraging Asynchronous FPGAs for Crypto Acceleration

To support the growing cryptocurrency industry, there is a need to create new ways to accelerate transaction processing. This project aims to use asynchronous Field-Programmable Gate Arrays (FPGAs) to accelerate cryptocurrency transaction processing. It explores how various distributed computing technologies can influence mining cryptocurrencies faster with FPGAs and generally enjoy faster transactions.

5. Cyber Security Future Technologies

Cyber security is a trending topic among businesses and individuals, especially as many work teams are going remote. Research like this can stretch the length and breadth of the cyber security and cloud security industries and project innovations depending on the researcher’s preferences. Another angle is to analyze existing or emerging solutions and present discoveries that can aid future research.

6. Exploring the Boundaries Between Art, Media, and Information Technology

The field of computers and media is a vast and complex one that intersects in many ways. Practitioners in these fields create images or animations using design technologies such as algorithmic mechanism design, design thinking, design theory, digital fabrication systems, and electronic design automation. This paper aims to define how both fields exist independently and symbiotically.

7. Evolution of Future Wireless Networks Using Cognitive Radio Networks

This research project aims to study how cognitive radio technology can drive evolution in future wireless networks. It will analyze the performance of cognitive radio-based wireless networks in different scenarios and measure its impact on spectral efficiency and network capacity. The research project will involve the development of a simulation model for studying the performance of cognitive radios in different scenarios.

8. The Role of Quantum Computing and Machine Learning in Advancing Medical Predictive Systems

In a paper titled Exploring Quantum Computing Use Cases for Healthcare, experts at IBM highlighted precision medicine and diagnostics as areas likely to benefit from quantum computing. Using biomedical imaging, machine learning, computational biology, and data-intensive computing systems, researchers can create more accurate disease progression prediction systems, disease severity classification systems, and 3D image reconstruction systems vital for treating chronic diseases.

9. Implementing Privacy and Security in Wireless Networks

Wireless networks are prone to attacks, and that has been a big concern for both individual users and organizations. According to the Cybersecurity and Infrastructure Security Agency (CISA), cyber security specialists are working to find reliable methods of securing wireless networks. This research aims to develop a secure and privacy-preserving communication framework for wireless communication and social networks.

10. Exploring the Challenges and Potentials of Biometric Systems Using Computational Techniques

Much discussion surrounds biometric systems and the potential for misuse and privacy concerns. When exploring how biometric systems can be effectively used, issues such as verification time and cost, hygiene, data bias, and cultural acceptance must be weighed. The paper may take a critical study into the various challenges using computational tools and predict possible solutions.

Other Examples of Computer Science Research Topics & Questions

Computer Research Topics

  • The confluence of theoretical computer science, deep learning, computational algorithms, and performance computing
  • Exploring human-computer interactions and the importance of usability in operating systems
  • Predicting the limits of networking and distributed systems
  • Controlling data mining on public systems through third-party applications
  • The impact of green computing on the environment and computational science

Computer Research Questions

  • Why are there so many programming languages?
  • Is there a better way to enhance human-computer interactions in computer-aided learning?
  • How safe is cloud computing, and what are some ways to enhance security?
  • Can computers effectively assist in the sequencing of human genes?
  • How valuable is SCRUM methodology in Agile software development?

Choosing the Right Computer Science Research Topic

Computer science research is a vast field, and it can be challenging to choose the right topic. There are a few things to keep in mind when making this decision. Choose a topic that you are interested in. This will make it easier to stay motivated and produce high-quality research for your computer science degree.

Select a topic that is relevant to your field of study. This will help you to develop specialized knowledge in the area. Choose a topic that has potential for future research. This will ensure that your research is relevant and up to date. Typically, coding bootcamps provide a framework that streamlines students’ projects to a specific field, making their search for a creative solution more effortless.

Computer Science Research Topics FAQ

To start a computer science research project, you should look at what other content is out there. Complete a literature review to know the available findings surrounding your idea. Design your research and ensure that you have the necessary skills and resources to complete the project.

The first step to conducting computer science research is to conceptualize the idea and review existing knowledge about that subject. You will design your research and collect data through surveys or experiments. Analyze your data and build a prototype or graphical model. You will also write a report and present it to a recognized body for review and publication.

You can find computer science research jobs on the job boards of many universities. Many universities have job boards on their websites that list open positions in research and academia. Also, many Slack and GitHub channels for computer scientists provide regular updates on available projects.

There are several hot topics and questions in AI that you can build your research on. Below are some AI research questions you may consider for your research paper.

  • Will it be possible to build artificial emotional intelligence?
  • Will robots replace humans in all difficult cumbersome jobs as part of the progress of civilization?
  • Can artificial intelligence systems self-improve with knowledge from the Internet?



Computer Technology Research Paper Topics


This list of computer technology research paper topics provides 33 potential topics for research papers and an overview article on the history of computer technology.

1. Analog Computers

Paralleling the split between analog and digital computers, in the 1950s the term analog computer was a posteriori projected onto pre-existing classes of mechanical, electrical, and electromechanical computing artifacts, subsuming them under the same category. The concept of analog, like the technical demarcation between analog and digital computer, was absent from the vocabulary of those classifying artifacts for the 1914 Edinburgh Exhibition, the first world’s fair emphasizing computing technology, and this leaves us with an invaluable index of the impressive number of classes of computing artifacts amassed during the few centuries of capitalist modernity. True, from the debate between ‘‘smooth’’ and ‘‘lumpy’’ artificial lines of computing (1910s) to the differentiation between ‘‘continuous’’ and ‘‘cyclic’’ computers (1940s), the subsequent analog–digital split became possible by the multitudinous accumulation of attempts to decontextualize the computer from its socio-historical use alternately to define the ideal computer technically. The fact is, however, that influential classifications of computing technology from the previous decades never provided an encompassing demarcation compared to the analog– digital distinction used since the 1950s. Historians of the digital computer find that the experience of working with software was much closer to art than science, a process that was resistant to mass production; historians of the analog computer find this to have been typical of working with the analog computer throughout all its aspects. The historiography of the progress of digital computing invites us to turn to the software crisis, which perhaps not accidentally, surfaced when the crisis caused by the analog ended. Noticeably, it was not until the process of computing with a digital electronic computer became sufficiently visual by the addition of a special interface—to substitute for the loss of visualization that was previously provided by the analog computer—that the analog computer finally disappeared.

2. Artificial Intelligence

Artificial intelligence (AI) is the field of software engineering that builds computer systems and occasionally robots to perform tasks that require intelligence. The term ‘‘artificial intelligence’’ was coined by John McCarthy for a summer workshop held at Dartmouth in 1956. This two-month workshop marks the official birth of AI, which brought together young researchers who would nurture the field as it grew over the next several decades: Marvin Minsky, Claude Shannon, Arthur Samuel, Ray Solomonoff, Oliver Selfridge, Allen Newell, and Herbert Simon. It would be difficult to argue that the technologies derived from AI research had a profound effect on our way of life by the beginning of the 21st century. However, AI technologies have been successfully applied in many industrial settings, medicine and health care, and video games. Programming techniques developed in AI research were incorporated into more widespread programming practices, such as high-level programming languages and time-sharing operating systems. While AI did not succeed in constructing a computer which displays the general mental capabilities of a typical human, such as the HAL computer in Arthur C. Clarke and Stanley Kubrick’s film 2001: A Space Odyssey, it has produced programs that perform some apparently intelligent tasks, often at a much greater level of skill and reliability than humans. More than this, AI has provided a powerful and defining image of what computer technology might someday be capable of achieving.

3. Computer and Video Games

Interactive computer and video games were first developed in laboratories as the late-night amusements of computer programmers or independent projects of television engineers. Their formats include computer software; networked, multiplayer games on time-shared systems or servers; arcade consoles; home consoles connected to television sets; and handheld game machines. The first experimental projects grew out of early work in computer graphics, artificial intelligence, television technology, hardware and software interface development, computer-aided education, and microelectronics. Important examples were Willy Higinbotham’s oscilloscope-based ‘‘Tennis for Two’’ at the Brookhaven National Laboratory (1958); ‘‘Spacewar!,’’ by Steve Russell, Alan Kotok, J. Martin Graetz and others at the Massachusetts Institute of Technology (1962); Ralph Baer’s television-based tennis game for Sanders Associates (1966); several networked games from the PLATO (Programmed Logic for Automatic Teaching Operations) Project at the University of Illinois during the early 1970s; and ‘‘Adventure,’’ by Will Crowther of Bolt, Beranek & Newman (1972), extended by Don Woods at Stanford University’s Artificial Intelligence Laboratory (1976). The main lines of development during the 1970s and early 1980s were home video consoles, coin-operated arcade games, and computer software.

4. Computer Displays

The display is an essential part of any general-purpose computer. Its function is to act as an output device to communicate data to humans using the highest bandwidth input system that humans possess—the eyes. Much of the development of computer displays has been about trying to get closer to the limits of human visual perception in terms of color and spatial resolution. Mainframe and minicomputers used ‘‘terminals’’ to display the output. These were fed data from the host computer and processed the data to create screen images using a graphics processor. The display was typically integrated with a keyboard system and some communication hardware as a terminal or video display unit (VDU) following the basic model used for teletypes. Personal computers (PCs) in the late 1970s and early 1980s changed this model by integrating the graphics controller into the computer chassis itself. Early PC displays typically displayed only monochrome text and communicated in character codes such as ASCII. Line-scanning frequencies were typically from 15 to 20 kilohertz—similar to television. CRT displays rapidly developed after the introduction of video graphics array (VGA) technology (640 by 480 pixels in 16 colors) in the mid-1980s and scan frequencies rose to 60 kilohertz or more for mainstream displays; 100 kilohertz or more for high-end displays. These displays were capable of displaying formats up to 2048 by 1536 pixels with high color depths. Because the human eye is very quick to respond to visual stimulation, developments in display technology have tended to track the development of semiconductor technology that allows the rapid manipulation of the stored image.
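To make the relationship between resolution, color depth, and refresh rate concrete, the short calculation below estimates framebuffer size and redraw bandwidth for a few display modes; apart from the 640 by 480, 16-color VGA mode mentioned above, the modes and refresh rates are illustrative assumptions.

```python
# Rough framebuffer size and redraw bandwidth for a few display modes.
# "bits" is bits per pixel (color depth); refresh rates are assumed values.
modes = [
    ("VGA 640x480, 16 colors (4-bit)", 640, 480, 4, 60),
    ("1024x768, 24-bit color", 1024, 768, 24, 75),
    ("2048x1536, 24-bit color", 2048, 1536, 24, 60),
]

for name, width, height, bits, refresh_hz in modes:
    framebuffer_bytes = width * height * bits / 8
    bandwidth_bytes_per_s = framebuffer_bytes * refresh_hz
    print(f"{name}: {framebuffer_bytes / 1024:.0f} KB per frame, "
          f"{bandwidth_bytes_per_s / 1e6:.1f} MB/s at {refresh_hz} Hz")
```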

5. Computer Memory for Personal Computers

During the second half of the twentieth century, the two primary methods used for the long-term storage of digital information were magnetic and optical recording. These methods were selected primarily on the basis of cost. Compared to core or transistorized random-access memory (RAM), storage costs for magnetic and optical media were several orders of magnitude cheaper per bit of information and were not volatile; that is, the information did not vanish when electrical power was turned off. However, access to information stored on magnetic and optical recorders was much slower compared to RAM memory. As a result, computer designers used a mix of both types of memory to accomplish computational tasks. Designers of magnetic and optical storage systems have sought meanwhile to increase the speed of access to stored information to increase the overall performance of computer systems, since most digital information is stored magnetically or optically for reasons of cost.
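A simple effective-access-time calculation illustrates why designers mixed fast, expensive RAM with slow, cheap magnetic storage; the latencies used below (100 nanoseconds for RAM, 10 milliseconds for a magnetic disk) are round illustrative numbers, not figures from the text.

```python
def effective_access_time(hit_rate, fast_ns, slow_ns):
    # Average access time when a fraction hit_rate of requests are served
    # from fast memory and the remainder fall through to slow storage.
    return hit_rate * fast_ns + (1 - hit_rate) * slow_ns

RAM_NS = 100            # assumed RAM latency
DISK_NS = 10_000_000    # assumed magnetic disk latency (10 ms)

for hit_rate in (0.90, 0.99, 0.999):
    avg = effective_access_time(hit_rate, RAM_NS, DISK_NS)
    print(f"hit rate {hit_rate:.1%}: average access {avg:,.0f} ns")
```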

6. Computer Modeling

Computer simulation models have transformed the natural, engineering, and social sciences, becoming crucial tools for disciplines as diverse as ecology, epidemiology, economics, urban planning, aerospace engineering, meteorology, and military operations. Computer models help researchers study systems of extreme complexity, predict the behavior of natural phenomena, and examine the effects of human interventions in natural processes. Engineers use models to design everything from jets and nuclear-waste repositories to diapers and golf clubs. Models enable astrophysicists to simulate supernovas, biochemists to replicate protein folding, geologists to predict volcanic eruptions, and physiologists to identify populations at risk of lead poisoning. Clearly, computer models provide a powerful means of solving problems, both theoretical and applied.
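As a minimal example of the kind of simulation model described above, the sketch below steps a classic SIR (susceptible-infected-recovered) epidemic model forward in time; the transmission rate, recovery rate, and initial conditions are arbitrary illustrative values.

```python
def sir_model(s, i, r, beta, gamma, days, dt=0.1):
    # Discrete-time SIR epidemic model; s, i, r are population fractions.
    history = []
    steps_per_day = int(1 / dt)
    for day in range(days):
        for _ in range(steps_per_day):
            new_infections = beta * s * i * dt
            new_recoveries = gamma * i * dt
            s -= new_infections
            i += new_infections - new_recoveries
            r += new_recoveries
        history.append((day, s, i, r))
    return history

# 0.1% of the population initially infected; beta and gamma are assumptions.
for day, s, i, r in sir_model(0.999, 0.001, 0.0, beta=0.3, gamma=0.1, days=120)[::20]:
    print(f"day {day:3d}  S={s:.3f}  I={i:.3f}  R={r:.3f}")
```

Real research models add far more structure (age groups, interventions, stochasticity), but the loop above captures the basic idea of simulating a system forward from stated assumptions.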

7. Computer Networks

Computers and computer networks have changed the way we do almost everything—the way we teach, learn, do research, access or share information, communicate with each other, and even the way we entertain ourselves. A computer network, in simple terms, consists of two or more computing devices (often called nodes) interconnected by means of some medium capable of transmitting data that allows the computers to communicate with each other in order to provide a variety of services to users.
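The toy script below makes the definition concrete: two "nodes" (a server thread and a client) exchange a message over a TCP socket on the loopback interface. The host, port, and message are arbitrary choices for the example.

```python
import socket
import threading
import time

HOST, PORT = "127.0.0.1", 50007        # two "nodes" connected via loopback

def server():
    # One node: listen, receive a message, echo it back.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _addr = srv.accept()
        with conn:
            data = conn.recv(1024)
            conn.sendall(b"echo: " + data)

if __name__ == "__main__":
    t = threading.Thread(target=server)
    t.start()
    time.sleep(0.5)                    # give the server a moment to start listening
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(b"hello over the network")
        print(cli.recv(1024).decode())
    t.join()
```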

8. Computer Science

Computer science occupies a unique position among the scientific and technical disciplines. It revolves around a specific artifact—the electronic digital computer—that touches upon a broad and diverse set of fields in its design, operation, and application. As a result, computer science represents a synthesis and extension of many different areas of mathematics, science, engineering, and business.

9. Computer-Aided Control Technology

The story of computer-aided control technology is inextricably entwined with the modern history of automation. Automation in the first half of the twentieth century involved (often analog) processes for continuous automatic measurement and control of hardware by hydraulic, mechanical, or electromechanical means. These processes facilitated the development and refinement of battlefield fire-control systems, feedback amplifiers for use in telephony, electrical grid simulators, numerically controlled milling machines, and dozens of other innovations.

10. Computer-Aided Design and Manufacture

Computer-aided design and manufacture, known by the acronym CAD/CAM, is a process for manufacturing mechanical components, wherein computers are used to link the information needed in and produced by the design process to the information needed to control the machine tools that produce the parts. However, CAD/CAM actually constitutes two separate technologies that developed along similar, but unrelated, lines until they were combined in the 1970s.

11. Computer-User Interface

A computer interface is the point of contact between a person and an electronic computer. Today’s interfaces include a keyboard, mouse, and display screen. Computer user interfaces developed through three distinct stages, which can be identified as batch processing, interactive computing, and the graphical user interface (GUI). Today’s graphical interfaces support additional multimedia features, such as streaming audio and video. In GUI design, every new software feature introduces more icons into the process of computer–user interaction. Presently, the large vocabulary of icons used in GUI design is difficult for users to remember, which creates a complexity problem. As GUIs become more complex, interface designers are adding voice recognition and intelligent agent technologies to make computer user interfaces even easier to operate.

12. Early Computer Memory

Mechanisms to store information were present in early mechanical calculating machines, going back to Charles Babbage’s analytical engine proposed in the 1830s. It introduced the concept of the “store” and, had it ever been built, would have held 1,000 numbers of up to 50 decimal digits. However, the move toward base-2 or binary computing in the 1930s brought about a new paradigm in technology—the digital computer, whose most elementary component was an on–off switch. Information on a digital system is represented using a combination of on and off signals, stored as binary digits (shortened to bits): zeros and ones. Text characters, symbols, and numerical values can all be coded as bits, so information stored in digital memory is just zeros and ones, regardless of the storage medium. The history of computer memory is closely linked to the history of computers, but a distinction should be made between primary (or main) and secondary memory. A computer need only operate on one segment of data at a time, and with main memory being a scarce resource, the rest of the data set could be stored in less expensive and more abundant secondary memory.
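
To make the idea concrete, here is a minimal Python sketch (illustrative only; real systems use encodings such as Unicode and two’s complement) showing that a text character and a number can end up as the very same pattern of bits in memory.

```python
# A minimal sketch of how text and numbers reduce to bits in memory.
# The encodings shown (ASCII for text, unsigned binary for integers)
# are illustrative choices; real systems may use Unicode, two's
# complement, floating point, and so on.

def char_to_bits(ch: str) -> str:
    """Return the 8-bit ASCII pattern for a single character."""
    return format(ord(ch), "08b")

def int_to_bits(value: int, width: int = 8) -> str:
    """Return an unsigned binary pattern of the given width."""
    return format(value, f"0{width}b")

if __name__ == "__main__":
    print(char_to_bits("A"))   # 01000001 -- the letter 'A' as stored bits
    print(int_to_bits(65))     # 01000001 -- the number 65, same pattern
```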

13. Early Digital Computers

Digital computers were a marked departure from the electrical and mechanical calculating and computing machines in wide use from the early twentieth century. The innovation was that information was represented using only two states (on or off), an approach that came to be known as “digital.” Binary (base 2) arithmetic and logic provided the tools for these machines to perform useful functions. George Boole’s binary system of algebra allowed logical statements to be expressed and manipulated as simple true or false values. Using only two states also greatly simplified the engineering and increased universality and accuracy. Further developments, from the early purpose-built machines to programmable ones, accompanied by many key technological advances, resulted in the well-known success and proliferation of the digital computer.

14. Electronic Control Technology

The advancement of electrical engineering in the twentieth century made a fundamental change in control technology. New electronic devices including vacuum tubes (valves) and transistors were used to replace electromechanical elements in conventional controllers and to develop new types of controllers. In these practices, engineers discovered basic principles of control theory that could be further applied to design electronic control systems.

15. Encryption and Code Breaking

The word cryptography comes from the Greek words for “hidden” (kryptos) and “to write” (graphein)—literally, the science of “hidden writing.” In the twentieth century, cryptography became fundamental to information technology (IT) security generally. Before the invention of the digital computer at mid-century, national governments across the world relied on mechanical and electromechanical cryptanalytic devices to protect their own national secrets and communications, as well as to expose enemy secrets. Code breaking played an important role in both World Wars I and II, and the successful exploits of Polish and British cryptographers and signals intelligence experts in breaking the code of the German Enigma ciphering machine (which had a range of possible transformations between a message and its code of approximately 150 million million million) are well documented.

16. Error Checking and Correction

In telecommunications, whether transmission of data or voice signals is over copper, fiber-optic, or wireless links, information coded in the transmitted signal must be decoded by the receiver against a background of noise. Signal errors can be introduced, for example, by physical defects in the transmission medium (semiconductor crystal defects, dust or scratches on magnetic memory, bubbles in optical fibers), by electromagnetic interference (natural or man-made) or cosmic rays, or by cross-talk (unwanted coupling) between channels. In digital signal transmission, data is transmitted as “bits” (ones or zeros, corresponding to on or off in electronic circuits). Random bit errors occur singly and independently of one another. A burst error is a large, sustained error or loss of data, perhaps caused by transmission problems in the connecting cables or by sudden noise. Analog-to-digital conversion can also introduce sampling errors.
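
As a concrete illustration, the sketch below implements the simplest possible error-detection scheme, a single even-parity bit per data word. It is not any particular telecommunications standard, but it shows why a lone random bit error is caught while a short burst of two errors can slip through.

```python
# A minimal even-parity sketch: append one parity bit per data word so
# that the total number of 1s is even. A single random bit error is
# detected; two errors in the same word (a short burst) cancel out and
# slip through, which is why real links use stronger codes (CRCs,
# Hamming or Reed-Solomon codes) on top of or instead of parity.

def add_parity(bits: list[int]) -> list[int]:
    parity = sum(bits) % 2           # 1 if the count of 1s is odd
    return bits + [parity]           # total count of 1s is now even

def check_parity(word: list[int]) -> bool:
    return sum(word) % 2 == 0        # True means "no error detected"

if __name__ == "__main__":
    word = add_parity([1, 0, 1, 1, 0, 0, 1])
    print(check_parity(word))        # True: transmitted intact

    word[2] ^= 1                     # flip one bit in transit
    print(check_parity(word))        # False: single-bit error caught

    word[3] ^= 1                     # flip a second bit (short burst)
    print(check_parity(word))        # True: the two errors mask each other
```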

17. Global Positioning System (GPS)

The NAVSTAR (NAVigation System Timing And Ranging) Global Positioning System (GPS) provides an unlimited number of military and civilian users worldwide with continuous, highly accurate data on their position in four dimensions—latitude, longitude, altitude, and time—through all weather conditions. It comprises space, control, and user segments. A constellation of 24 satellites in nearly circular orbits at an altitude of about 10,900 nautical miles—six orbital planes, equally spaced 60 degrees apart, inclined approximately 55 degrees relative to the equator, each with four equidistant satellites—transmits microwave signals on two different L-band frequencies. From any point on earth, between five and eight satellites are “visible” to the user. Synchronized, extremely precise atomic clocks—rubidium and cesium—aboard the satellites render the constellation semiautonomous by alleviating the need to continuously control the satellites from the ground. The control segment consists of a master facility at Schriever Air Force Base, Colorado, and a global network of automated stations. It passively tracks the entire constellation and, via an S-band uplink, periodically sends updated orbital and clock data to each satellite to ensure that the navigation signals received by users remain accurate. Finally, GPS users—on land, at sea, in the air, or in space—rely on commercially produced receivers to convert satellite signals into position, time, and velocity estimates.
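
The geometric idea behind converting ranges into a position can be sketched in two dimensions. The toy example below assumes exact distances to three beacons at known positions; an actual GPS receiver solves the analogous problem in three dimensions, with an extra unknown for its clock bias, using noisy pseudoranges and iterative least squares.

```python
import math

# A toy 2D trilateration sketch: given exact ranges to three beacons at
# known positions, solve for the receiver's position. Real GPS solves the
# same kind of problem in 3D, with a fourth unknown (receiver clock bias)
# and noisy pseudoranges, typically by iterative least squares.

def trilaterate(p1, p2, p3, r1, r2, r3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting the circle equations pairwise gives a linear 2x2 system.
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    b2 = r1**2 - r3**2 - x1**2 + x3**2 - y1**2 + y3**2
    det = a11 * a22 - a12 * a21
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y

if __name__ == "__main__":
    beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
    truth = (3.0, 4.0)
    ranges = [math.dist(truth, b) for b in beacons]
    print(trilaterate(*beacons, *ranges))   # ~(3.0, 4.0)
```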

18. Gyrocompass and Inertial Guidance

Before the twentieth century, navigation at sea employed two complementary methods, astronomical and dead reckoning. The former involved direct measurements of celestial phenomena to ascertain position, while the latter required continuous monitoring of a ship’s course, speed, and distance run. New navigational technology was required not only for iron ships, in which traditional compasses required correction, but also for aircraft and submarines, in which magnetic compasses cannot be used. Owing to their rapid motion, aircraft presented challenges for near-instantaneous collection and reduction of navigation data. Electronics made possible the exploitation of radio and the adaptation of the gyroscope to direction finding through the invention of the nonmagnetic gyrocompass.

Although the Cold War arms race after World War II led to the development of inertial navigation, German manufacture of the V-2 rocket under the direction of Wernher von Braun during the war involved a proto-inertial system, a two-gimballed gyro with an integrator to determine speed. Inertial guidance combines a gyrocompass with accelerometers installed along orthogonal axes, devices that record all accelerations of the vehicle in which the system is installed. With this arrangement, if the initial position of the vehicle is known, then the vehicle’s position at any moment is known, because integrators record all directions and accelerations and calculate speeds and distance run. Inertial guidance devices can subtract accelerations due to gravity or other motions of the vehicle. Because inertial guidance does not depend on an outside reference, it is the ultimate dead reckoning system, ideal for the nuclear submarines for which it was invented and for ballistic missiles. Its self-contained nature makes it resistant to electronic countermeasures. Inertial systems were first installed in commercial aircraft during the 1960s. The expense of manufacturing inertial guidance mechanisms (and their necessary management by computer) has limited their application largely to military and some commercial purposes. Inertial systems accumulate errors, so their use at sea (except in submarines) has been as an adjunct to other navigational methods, unlike aircraft applications. Only the development of the global positioning system (GPS) at the end of the century promised to render all previous navigational technologies obsolete. Nevertheless, a range of technologies, some dating to the beginning of the century, remain in use in a variety of commercial and leisure applications.
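
The core of inertial dead reckoning, integrating measured acceleration twice to recover position from a known starting point, can be sketched in one dimension. The numbers below (sample rate, sensor bias) are made up purely for illustration; real systems work in three axes, track orientation with gyroscopes, and compensate for gravity.

```python
# A toy 1D dead-reckoning sketch: integrate measured acceleration once to
# get velocity and again to get position, starting from a known state.
# Real inertial systems work in three axes, use gyroscopes to keep track
# of orientation, and subtract gravity; they also drift, which the small
# sensor bias below is meant to illustrate.

DT = 0.1          # seconds between accelerometer samples (illustrative)
BIAS = 0.002      # a small constant sensor error in m/s^2 (illustrative)

def dead_reckon(accels, v0=0.0, x0=0.0):
    v, x = v0, x0
    for a in accels:
        v += a * DT               # integrate acceleration -> velocity
        x += v * DT               # integrate velocity -> position
    return x, v

if __name__ == "__main__":
    # One minute of samples: accelerate for 10 s, then coast.
    true_accels = [1.0] * 100 + [0.0] * 500
    measured = [a + BIAS for a in true_accels]
    print(dead_reckon(true_accels))   # position/velocity with perfect sensing
    print(dead_reckon(measured))      # slightly off: errors accumulate over time
```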

19. Hybrid Computers

Following the emergence of the analog–digital demarcation in the late 1940s—and the ensuing battle between the speedy analog and the accurate digital—the term “hybrid computer” surfaced in the early 1960s. The assumption held by the adherents of the digital computer—that the mechanization of computational labor would keep pace with the equally dynamic increase in computational work—was becoming a universal ideology. From this perspective, the digital computer justly appeared to be technically superior. In introducing the digital computer to social realities, however, extensive interaction with the experienced analog computer adherents proved indispensable, especially given that the digital proponents’ expectation of progress by employing the available and inexpensive hardware was stymied by the lack of inexpensive software. From this perspective—however historiographically unwelcome it may be to those who hold an essentialist conception of the analog–digital demarcation—the history of the hybrid computer suggests that the computer as we now know it was brought about by linking the analog and the digital, not by separating them. Placing the ideal analog and the ideal digital at the two poles, all computing techniques that combined some features of both fell under “hybrid computation”; the designators “balanced” or “true” were reserved for machines built with appreciable amounts of both. True hybrids occupied the middle of a spectrum that ranged across: pure analog computers, analog computers using digital-type numerical analysis techniques, analog computers programmed with the aid of digital computers, analog computers using digital control and logic, analog computers using digital subunits, analog computers using digital computers as peripheral equipment, balanced hybrid computer systems, digital computers using analog subroutines, digital computers with analog arithmetic elements, digital computers designed to permit analog-type programming, digital computers with analog-oriented compilers and interpreters, and pure digital computers.

20. Information Theory

Information theory, also known originally as the mathematical theory of communication, was first explicitly formulated during the mid-twentieth century. Almost immediately it became a foundation: first, for the more systematic design and utilization of numerous telecommunication and information technologies; and second, for resolving a paradox in thermodynamics. Finally, information theory has contributed to new interpretations of a wide range of biological and cultural phenomena, from organic physiology and genetics to cognitive behavior, human language, economics, and political decision making. Reflecting the symbiosis between theory and practice typical of twentieth-century technology, technical issues in early telegraphy and telephony gave rise to a proto-information theory developed by Harry Nyquist at Bell Labs in 1924 and Ralph Hartley, also at Bell Labs, in 1928. This theory in turn contributed to advances in telecommunications, which stimulated the development of information theory per se by Claude Shannon and Warren Weaver in their book The Mathematical Theory of Communication, published in 1949. As articulated by Claude Shannon, a Bell Labs researcher, the technical concept of information is defined by the probability of a specific message or signal being picked out from a number of possibilities and transmitted from A to B. Information in this sense is mathematically quantifiable. The amount of information, I, conveyed by a signal, S, is inversely related to its probability, P: the more improbable a message, the more information it contains. To facilitate mathematical analysis, the measure is conveniently defined as I = log2(1/P(S)), and its unit is the binary digit, or “bit” for short. Thus in the simplest case of a two-state signal (1 or 0, corresponding to on or off in electronic circuits), with equal probability for each state, the transmission of either state as the code for a message would convey one bit of information. The theory of information opened up by this conceptual analysis has become the basis for constructing and analyzing digital computational devices and a whole range of information technologies (i.e., technologies including telecommunications and data processing), from telephones to computer networks.
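
A short sketch of Shannon’s measure follows: the information in a message is log2(1/P), and the entropy of a source is the probability-weighted average of that quantity. The example probabilities are arbitrary and chosen only to show the behavior of the formula.

```python
import math

# A minimal sketch of Shannon's measure: the information carried by a
# message is log2(1/P), so rarer messages carry more bits. The entropy of
# a source is the probability-weighted average of this quantity.

def information_bits(p: float) -> float:
    """Bits of information in a message that occurs with probability p."""
    return math.log2(1.0 / p)

def entropy(probs: list[float]) -> float:
    """Average bits per message for a source with the given distribution."""
    return sum(p * information_bits(p) for p in probs if p > 0)

if __name__ == "__main__":
    print(information_bits(0.5))      # 1.0 bit: a fair coin flip
    print(information_bits(1 / 8))    # 3.0 bits: one outcome in eight
    print(entropy([0.5, 0.5]))        # 1.0 bit per symbol
    print(entropy([0.9, 0.1]))        # ~0.47 bits: a biased, predictable source
```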

21. Internet

The Internet is a global computer network of networks whose origins are found in U.S. military efforts. In response to Sputnik and the emerging space race, the Advanced Research Projects Agency (ARPA) was formed in 1958 as an agency of the Pentagon. The researchers at ARPA were given a generous mandate to develop innovative technologies such as communications.

In 1962, psychologist J.C.R. Licklider from the Massachusetts Institute of Technology’s Lincoln Laboratory joined ARPA to take charge of the Information Processing Techniques Office (IPTO). In 1963 Licklider wrote a memo proposing an interactive network allowing people to communicate via computer. This project did not materialize. In 1966, Bob Taylor, then head of the IPTO, noted that he needed three different computer terminals to connect to three different machines in different locations around the nation. Taylor also recognized that universities working with IPTO needed more computing resources. Instead of the government buying machines for each university, why not share machines? Taylor revitalized Licklider’s idea, securing $1 million in funding, and hired 29-year-old Larry Roberts to direct the creation of ARPAnet.

In 1974, Robert Kahn and Vinton Cerf proposed the first internetworking protocol, a way for datagrams (packets) to be communicated between disparate networks, which they called an “internet.” Their efforts created the transmission control protocol/internet protocol (TCP/IP). In 1982, TCP/IP replaced NCP on ARPAnet. Other networks adopted TCP/IP, and it became the dominant standard for all networking by the late 1990s.

In 1981 the U.S. National Science Foundation (NSF) created the Computer Science Network (CSNET) to provide universities that did not have access to ARPAnet with their own network. In 1986, the NSF sponsored the NSFNET “backbone” to connect five supercomputing centers. The backbone also connected ARPAnet and CSNET together, and the idea of a network of networks became firmly entrenched. The open technical architecture of the Internet allowed numerous innovations to be grafted easily onto the whole. When ARPAnet was dismantled in 1990, the Internet was thriving at universities and technology-oriented companies. The NSF backbone was dismantled in 1995 when the NSF realized that commercial entities could keep the Internet running and growing on their own, without government subsidy. Commercial network providers worked through the Commercial Internet Exchange to manage network traffic.

22. Mainframe Computers

The term “computer” currently refers to a general-purpose, digital, electronic, stored-program calculating machine. The term “mainframe” refers to a large, expensive, multiuser computer, able to handle a wide range of applications. The term was derived from the main frame or cabinet in which the central processing unit (CPU) and main memory of a computer were kept, separate from the cabinets that held peripheral devices used for input and output.

Computers are generally classified as supercomputers, mainframes, minicomputers, or microcomputers. This classification is based on factors such as processing capability, cost, and applications, with supercomputers the fastest and most expensive. All computers were called mainframes until the 1960s, including the first supercomputer, the naval ordnance research calculator (NORC), offered by International Business Machines (IBM) in 1954. In 1960, Digital Equipment Corporation (DEC) shipped the PDP-1, a computer that was much smaller and cheaper than a mainframe.

Mainframes once each filled a large room, cost millions of dollars, and needed a full maintenance staff, partly in order to repair the damage caused by the heat generated by their vacuum tubes. These machines were characterized by proprietary operating systems and connections through dumb terminals that had no local processing capabilities. As personal computers developed and began to approach mainframes in speed and processing power, however, mainframes have evolved to support a client/server relationship, and to interconnect with open standard-based systems. They have become particularly useful for systems that require reliability, security, and centralized control. Their ability to process large amounts of data quickly makes them particularly valuable for storage area networks (SANs). Mainframes today contain multiple CPUs, providing additional speed through multiprocessing operations. They support many hundreds of simultaneously executing programs, as well as numerous input and output processors for multiplexing devices, such as video display terminals and disk drives. Many legacy systems, large applications that have been developed, tested, and used over time, are still running on mainframes.

23. Mineral Prospecting

Twentieth century mineral prospecting draws upon the accumulated knowledge of previous exploration and mining activities, advancing technology, expanding knowledge of geologic processes and deposit models, and mining and processing capabilities to determine where and how to look for minerals of interest. Geologic models have been developed for a wide variety of deposit types; the prospector compares geologic characteristics of potential exploration areas with those of deposit models to determine which areas have similar characteristics and are suitable prospecting locations. Mineral prospecting programs are often team efforts, integrating general and site-specific knowledge of geochemistry, geology, geophysics, and remote sensing to “discover” hidden mineral deposits and “measure” their economic potential with increasing accuracy and reduced environmental disturbance. Once a likely target zone has been identified, multiple exploration tools are used in a coordinated program to characterize the deposit and its economic potential.

24. Packet Switching

Historically the first communications networks were telegraphic—the electrical telegraph replacing the mechanical semaphore stations in the mid-nineteenth century. Telegraph networks were largely eclipsed by the advent of the voice (telephone) network, which first appeared in the late nineteenth century, and provided the immediacy of voice conversation. The Public Switched Telephone Network allows a subscriber to dial a connection to another subscriber, with the connection being a series of telephone lines connected together through switches at the telephone exchanges along the route. This technique is known as circuit switching, as a circuit is set up between the subscribers, and is held until the call is cleared.

One of the disadvantages of circuit switching is the fact that the capacity of the link is often significantly underused due to silences in the conversation, but the spare capacity cannot be shared with other traffic. Another disadvantage is the time it takes to establish the connection before the conversation can begin. One could liken this to sending a railway engine from London to Edinburgh to set the points before returning to pick up the carriages. What is required is a compromise between the immediacy of conversation on an established circuit-switched connection and the ad hoc delivery of a store-and-forward message system. This is what packet switching is designed to provide.
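
The following toy sketch shows the basic packet idea: a message is chopped into small packets, each carrying a tiny header, so the pieces can share links with other traffic, arrive out of order, and still be reassembled at the destination. The header fields and packet size here are invented for illustration; real protocol suites such as TCP/IP add addressing, checksums, retransmission, and congestion control.

```python
import random

# A toy sketch of packet switching: split a message into packets with a
# small header (message id, sequence number, total count) so that the
# pieces can be routed independently and reassembled at the destination.

PAYLOAD_SIZE = 8  # bytes of data per packet, arbitrary for the example

def packetize(msg_id: int, data: bytes):
    chunks = [data[i:i + PAYLOAD_SIZE] for i in range(0, len(data), PAYLOAD_SIZE)]
    total = len(chunks)
    return [{"id": msg_id, "seq": n, "total": total, "payload": c}
            for n, c in enumerate(chunks)]

def reassemble(packets):
    packets = sorted(packets, key=lambda p: p["seq"])   # restore original order
    assert len(packets) == packets[0]["total"], "missing packets"
    return b"".join(p["payload"] for p in packets)

if __name__ == "__main__":
    message = b"Packets share the network; circuits reserve it."
    packets = packetize(msg_id=1, data=message)
    random.shuffle(packets)                # simulate out-of-order delivery
    print(reassemble(packets).decode())
```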

25. Personal Computers

A personal computer, or PC, is designed for personal use. Its central processing unit (CPU) runs single-user systems and application software, processes input from the user, and sends output to a variety of peripheral devices. Programs and data are stored in memory and attached storage devices. Personal computers are generally single-user desktop machines, but the term has been applied to any computer that “stands alone” for a single user, including portable computers.

The technology that enabled the construction of personal computers was the microprocessor, a programmable integrated circuit (or “chip”) that acts as the CPU. Intel introduced the first microprocessor in 1971, the 4-bit 4004, which it called a “microprogrammable computer on a chip.” The 4004 was originally developed as a general-purpose chip for a programmable calculator, but Intel introduced it as part of its Microcomputer System 4-bit, or MCS-4, which also included read-only memory (ROM) and random-access memory (RAM) chips and a shift register chip. In August 1972, Intel followed with the 8-bit 8008, then the more powerful 8080 in June 1974. Following Intel’s lead, computers based on the 8080 were usually called microcomputers.

The success of the minicomputer during the 1960s prepared computer engineers and users for “single person, single CPU” computers. Digital Equipment Corporation’s (DEC) widely used PDP-10, for example, was smaller, cheaper, and more accessible than large mainframe computers. Timeshared computers operating under operating systems such as TOPS-10 on the PDP-10—co-developed by the Massachusetts Institute of Technology (MIT) and DEC in 1972—created the illusion of individual control of computing power by providing rapid access to personal programs and files. By the early 1970s, the accessibility of minicomputers, advances in microelectronics, and component miniaturization created expectations of affordable personal computers.

26. Printers

Printers generally can be categorized as either impact or nonimpact. Like typewriters, impact printers generate output by striking the page with a solid object. Impact printers include daisy wheel and dot matrix printers. The daisy wheel printer, introduced in 1972 by Diablo Systems, operates by spinning the daisy wheel to the correct character, whereupon a hammer strikes it, forcing the character through an inked ribbon and onto the paper. Dot matrix printers operate by using a series of small pins to strike an ink-coated ribbon, forming each character as a matrix or grid of dots. The strike of each pin forces ink onto the paper at the point of impact. Unlike daisy wheel printers, dot matrix printers can generate italic and other character styles by producing different pin patterns. Nonimpact printers generate images by spraying or fusing ink onto paper or other output media. This category includes inkjet printers, laser printers, and thermal printers. Whether they are inkjet or laser, impact or nonimpact, all modern printers incorporate features of dot matrix technology in their design: they operate by generating dots onto paper or other physical media.
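
The dot matrix principle is easy to see in miniature: a character is nothing more than a grid of dots. The 5×7 bitmap of the letter “A” below is hand-made for illustration; an impact printer would fire a pin for each 1, while inkjet and laser printers place dots far more densely.

```python
# A tiny sketch of the dot-matrix idea: a character is just a grid of dots,
# here a hand-made 5x7 bitmap of the letter "A" (illustrative only).

LETTER_A = [
    "01110",
    "10001",
    "10001",
    "11111",
    "10001",
    "10001",
    "10001",
]

def print_glyph(rows):
    for row in rows:
        # a "#" where a pin would strike (or a dot be placed), blank elsewhere
        print("".join("#" if bit == "1" else " " for bit in row))

if __name__ == "__main__":
    print_glyph(LETTER_A)
```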

27. Processors for Computers

A processor is the part of the computer system that manipulates the data. The first computer processors of the late 1940s and early 1950s performed three main functions and had three main components. They worked in a cycle to fetch, decode, and execute instructions, and they were made up of the arithmetic and logic unit, the control unit, and some additional storage components, or registers. Today, most processors contain these components and perform these same functions, but since the 1960s they have developed different forms, capabilities, and organization. As with computers in general, increasing speed and decreasing size have marked their development.
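
A toy accumulator machine makes the fetch–decode–execute cycle concrete. The four-instruction set below is invented for the example and does not correspond to any real processor, but the loop structure mirrors the cycle described above.

```python
# A toy processor sketch: a single accumulator register and a made-up,
# four-instruction set, run through the classic fetch-decode-execute cycle.
# Real processors differ enormously, but the loop structure is the same.

def run(program, memory):
    acc = 0                       # accumulator register
    pc = 0                        # program counter
    while True:
        op, arg = program[pc]     # fetch
        pc += 1
        if op == "LOAD":          # decode + execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory
        else:
            raise ValueError(f"unknown opcode {op}")

if __name__ == "__main__":
    # memory[2] = memory[0] + memory[1]
    prog = [("LOAD", 0), ("ADD", 1), ("STORE", 2), ("HALT", None)]
    print(run(prog, [7, 35, 0]))   # [7, 35, 42]
```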

28. Radionavigation

Astronomical and dead-reckoning techniques furnished the methods of navigating ships until the twentieth century, when the exploitation of radio waves, coupled with electronics, met the needs of aircraft with their fast speeds, and also transformed all navigational techniques. The application of radio to dead reckoning has allowed vessels to determine their positions in all weather by direction finding (known as radio direction finding, or RDF) or by hyperbolic systems. Another use of radio, radar (radio detection and ranging), enables vessels to determine their distance to, or their bearing from, objects of known position. Radionavigation complements traditional navigational methods by employing three frames of reference. First, radio enables a vessel to navigate by lines of bearing to shore transmitters (the most common use of radio). This is directly analogous to the use of lighthouses for bearings. Second, shore stations may take radio bearings of craft and relay computed positions to them. Third, radio beacons provide aircraft or ships with signals that function as true compasses.

29. Software Application Programs

At the beginning of the computer age, around the late 1940s, the inventors of the intelligent machine were not thinking about applications software, or any software other than that needed to run the bare machine to do mathematical calculating. It was only when Maurice Wilkes’ young protégé David Wheeler crafted a tidy set of initial orders for the EDSAC, an early programmable digital computer, that users could string standard subroutines together into a program and have execution jump between them. This was the beginning of software as we know it—programs, beyond the system software that runs the bare machine, written to make the computer do whatever the user desires. “Applications” are software other than the system programs that run the actual hardware. Manufacturers always had this software, and as the 1950s progressed they would “bundle” applications with hardware to make expensive computers more attractive. Some programming departments were even placed in the marketing departments.

30. Software Engineering

Software engineering aims to develop the programs that allow digital computers to do useful work in a systematic, disciplined manner that produces high-quality software on time and on budget. As computers have spread throughout industrialized societies, software has become a multibillion dollar industry. Both the users and developers of software depend a great deal on the effectiveness of the development process.

Software is a concept that didn’t even pertain to the first electronic digital computers. They were “programmed” through switches and patch cables that physically altered the electrical pathways of the machine. It was not until the Manchester Mark I, the first operational stored-program electronic digital computer, was developed in 1948 at the University of Manchester in England that configuring the machine to solve a specific problem became a matter of software rather than hardware. Subsequently, instructions were stored in memory along with data.

31. Supercomputers

Supercomputers are high-performance computing devices that are generally used for numerical calculation, for the study of physical systems either through numerical simulation or through the processing of scientific data. Initially, they were large, expensive, mainframe computers, usually owned by government research labs. By the end of the twentieth century, they were more often networks of inexpensive small computers. The common element of all of these machines was their ability to perform high-speed floating-point arithmetic—binary arithmetic that approximates real numbers with a fixed number of bits—the basis of numerical computation.
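
Because floating-point arithmetic uses a fixed number of bits, results are approximations. The small demonstration below (Python floats are IEEE 754 double precision) shows two familiar consequences: a simple decimal value that cannot be stored exactly, and an answer that changes with the grouping of operations.

```python
# A small illustration of floating-point approximation (Python floats are
# IEEE 754 double precision, i.e., 64 bits). Because values are stored with
# a fixed number of binary digits, simple decimal quantities may only be
# represented approximately, and the grouping of operations can change the
# result.

a = 0.1 + 0.2
print(a)                      # 0.30000000000000004, not exactly 0.3
print(a == 0.3)               # False

big, small = 1.0e16, 1.0
print((big + small) - big)    # 0.0: the 1.0 is lost next to 1e16
print(small + (big - big))    # 1.0: grouping the operations differently keeps it
```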

With the advent of inexpensive supercomputers, these machines moved beyond the large government labs and into smaller research and engineering facilities. Some were used for the study of social science. A few were employed by business concerns, such as stock brokerages or graphic designers.

32. Systems Programs

The operating systems used in all computers today are a result of the development and organization of early systems programs designed to control and regulate the operations of computer hardware. The early computing machines such as the ENIAC of 1945 were “programmed” manually, by connecting cables and setting switches for each new calculation. With the advent of the stored-program computer in the late 1940s (the Manchester Mark I, EDVAC, and EDSAC (electronic delay storage automatic calculator)), the first system programs such as assemblers and compilers were developed and installed. These programs performed oft-repeated and basic operations for computer use, including converting programs into machine code, storing and retrieving files, managing computer resources and peripherals, and aiding in the compilation of new programs. With the advent of programming languages, and the dissemination of more computers in research centers, universities, and businesses during the late 1950s and 1960s, a large group of users began developing programs, improving usability, and organizing system programs into operating systems.

The 1970s and 1980s saw a turn away from some of the complications of system software, an interweaving of features from different operating systems, and the development of systems programs for the personal computer. In the early 1970s, two programmers from Bell Laboratories, Ken Thompson and Dennis Ritchie, developed a smaller, simpler operating system called UNIX. Unlike past system software, UNIX was portable and could be run on different computer systems. Due in part to low licensing fees and simplicity of design, UNIX increased in popularity throughout the 1970s. At the Xerox Palo Alto Research Center, research during the 1970s produced the graphical user interface (GUI) concepts later adopted in the system software of the Apple Macintosh computer. This type of system software filtered the user’s interaction with the computer through graphics or icons representing computer processes. In 1985, a year after the release of the Apple Macintosh computer, a GUI was overlaid on Microsoft’s then-dominant operating system, MS-DOS, to produce Microsoft Windows. The Microsoft Windows series of operating systems became, and remains, the dominant operating system on personal computers.

33. World Wide Web

The World Wide Web (Web) is a “finite but unbounded” collection of media-rich digital resources that are connected through high-speed digital networks. It relies upon an Internet protocol suite that supports cross-platform transmission and makes available a wide variety of media types (i.e., multimedia). The cross-platform delivery environment represents an important departure from more traditional network communications protocols such as e-mail, telnet, and the file transfer protocol (FTP) because it is content-centric. It is also to be distinguished from earlier document acquisition systems such as Gopher, which was designed in 1991, originally as a mainframe program but quickly implemented over networks, and the wide area information system (WAIS), also released in 1991. WAIS accommodated a narrower range of media formats and failed to include hyperlinks within its navigation protocol. Following the success of Gopher on the Internet, the Web quickly extended and enriched the metaphor of integrated browsing and navigation. This made it possible to navigate and peruse a wide variety of media types effortlessly on the Web, which in turn led to the Web’s hegemony as an Internet protocol.

History of Computer Technology

Computer Technology

The modern computer—the (electronic) digital computer in which the stored program concept is realized and hence self-modifying programs are possible—was only invented in the 1940s. Nevertheless, the history of computing (interpreted as the usage of modern computers) is only understandable against the background of the many forms of information processing as well as mechanical computing devices that solved mathematical problems in the first half of the twentieth century. The part these several predecessors played in the invention and early history of the computer may be interpreted from two different perspectives: on the one hand it can be argued that these machines prepared the way for the modern digital computer, on the other hand it can be argued that the computer, which was invented as a mathematical instrument, was reconstructed to be a data-processing machine, a control mechanism, and a communication tool.

The invention and early history of the digital computer has its roots in two different kinds of developments: first, information processing in business and government bureaucracies; and second, the use and the search for mathematical instruments and methods that could solve mathematical problems arising in the sciences and in engineering.

Origins in Mechanical Office Equipment

The development of information processing in business and government bureaucracies had its origins in the late nineteenth century, which was not just an era of industrialization and mass production but also a time of continuous growth in administrative work. The economic precondition for this development was the creation of a global economy, which caused growth in production of goods and trade. This brought with it an immense increase in correspondence, as well as monitoring and accounting activities—corporate bureaucracies began to collect and process data in increasing quantities. Almost at the same time, government organizations became more and more interested in collating data on population and demographic changes (e.g., expanding tax revenues, social security, and wide-ranging planning and monitoring functions) and analyzing this data statistically.

Bureaucracies in the U.S. and in Europe reacted in a different way to these changes. While in Europe for the most part neither office machines nor telephones entered offices until 1900, in the U.S. in the last quarter of the nineteenth century the information-handling techniques in bureaucracies were radically changed because of the introduction of mechanical devices for writing, copying, and counting data. The rise of big business in the U.S. had caused a growing demand for management control tools, which was fulfilled by a new ideology of systematic management together with the products of the rising office machines industry. Because of a later start in industrialization, the government and businesses in the U.S. were not forced to reorganize their bureaucracies when they introduced office machines. This, together with an ideological preference for modern office equipment, was the cause of a market for office machines and of a far-reaching mechanization of office work in the U.S. In the 1880s typewriters and cash registers became very widespread, followed by adding machines and book-keeping machines in the 1890s. From 1880 onward, the makers of office machines in the U.S. underwent a period of enormous growth, and in 1920 the office machine industry annually generated about $200 million in revenue. In Europe, by comparison, mechanization of office work emerged about two decades later than in the U.S.—both Germany and Britain adopted the American system of office organization and extensive use of office machines for the most part no earlier than the 1920s.

During the same period the rise of a new office machine technology began. Punched card systems, initially invented by Herman Hollerith to analyze the U.S. census of 1890, were introduced. By 1911 Hollerith’s company had only about 100 customers, but after it merged in the same year with two other companies to become the Computing-Tabulating-Recording Company (CTR), it began a tremendous ascent to become the world leader in the office machine industry. CTR’s general manager, Thomas J. Watson, understood the extraordinary potential of these punched-card accounting devices, which enabled their users to process enormous amounts of data largely automatically, rapidly, and at an acceptable level of cost and effort. Due to Watson’s insights and his extraordinary management abilities, the company (by then renamed International Business Machines (IBM)) became the fourth largest office machine supplier in the world by 1928—topped only by Remington Rand, National Cash Register (NCR), and the Burroughs Adding Machine Company.

Origin of Calculating Devices and Analog Instruments

Compared with the fundamental changes in the world of corporate and government bureaucracies caused by office machinery during the late nineteenth and early twentieth century, calculating machines and instruments seemed to have only a minor influence in the world of science and engineering. Scientists and engineers had always been confronted with mathematical problems and had over the centuries developed techniques such as mathematical tables. However, many new mathematical instruments emerged in the nineteenth century and increasingly began to change the world of science and engineering. Apart from the slide rule, which came into popular use in Europe from the early nineteenth century onwards (and became the symbol of the engineer for decades), calculating machines and instruments were only produced on a large scale in the middle of the nineteenth century.

In the 1850s the production of calculating machines as well as that of planimeters (used to measure the area of closed curves, a typical problem in land surveying) started on different scales. Worldwide, less than 2,000 calculating machines were produced before 1880, but more than 10,000 planimeters were produced by the early 1880s. Also, various types of specialized mathematical analog instruments were produced on a very small scale in the late nineteenth century; among them were integraphs for the graphical solution of special types of differential equations, harmonic analyzers for the determination of Fourier coefficients of a periodic function, and tide predictors that could calculate the time and height of the ebb and flood tides.

Nonetheless, in 1900 only geodesists and astronomers (as well as part of the engineering community) made extensive use of mathematical instruments. In addition, the establishment of applied mathematics as a new discipline took place at German universities on a small scale and the use of apparatus and machines as well as graphical and numerical methods began to flourish during this time. After World War I, the development of engineering sciences and of technical physics gave a tremendous boost to applied mathematics in Germany and Britain. In general, scientists and engineers became more aware of the capabilities of calculating machines and a change of the calculating culture—from the use of tables to the use of calculating machines—took place.

One particular problem increasingly encountered by mechanical and electrical engineers in the 1920s was the solution of several types of differential equations that could not be solved analytically. As one important result of this development, a new type of analog instrument—the so-called “differential analyzer”—was invented in 1931 by the engineer Vannevar Bush at the Massachusetts Institute of Technology (MIT). In contrast to its predecessors—several types of integraphs—this machine (later called an analog computer) could be used to solve not just one special class of differential equation, but a more general class of differential equations associated with engineering problems. Before the digital computer was invented in the 1940s there was intensive use of analog instruments similar to Bush’s differential analyzer, and a number of machines were constructed in the U.S. and in Europe on the model of Bush’s machine before and during World War II. Analog instruments also became increasingly important in several fields such as the fire control of artillery on warships and the control of rockets. It is worth mentioning here that an analog computer could be constructed for only a limited class of scientific and engineering problems—weather forecasting and the problem of shock waves produced by an atomic bomb, for example, required the solution of partial differential equations, for which a digital computer was needed.

The Invention of the Computer

The invention of the electronic digital stored-program computer is directly connected with the development of numerical calculation tools for the solution of mathematical problems in the sciences and in engineering. The ideas that led to the invention of the computer were developed simultaneously by scientists and engineers in Germany, Britain, and the U.S. in the 1930s and 1940s. The first freely programmable, program-controlled automatic calculator was developed by the civil engineering student Konrad Zuse in Germany. Zuse started development work on program-controlled computing machines in the 1930s, when he had to deal with extensive calculations in statics, and in 1941 his Z3, which was based on electromechanical relay technology, became operational.

Several similar developments in the U.S. were in progress at the same time. In 1937 Howard Aiken, a physics student at Harvard University, approached IBM to build a program-controlled calculator—later called the “Harvard Mark I.” On the basis of a concept Aiken had developed from his experiences with the numerical solution of partial differential equations, the machine was built and became operational in 1944. At almost the same time a series of important relay computers was built at the Bell Laboratories in New York following a suggestion by George R. Stibitz. All these developments in the U.S. were spurred by the outbreak of World War II. The first large-scale programmable electronic computer, called the Colossus, was built in complete secrecy in 1943 to 1944 at Bletchley Park in Britain in order to help break the ciphers of the German Lorenz teleprinter machines.

However, it was neither these relay calculators nor the Colossus that proved decisive for the development of the universal computer, but the ENIAC (electronic numerical integrator and computer), which was developed at the Moore School of Engineering at the University of Pennsylvania. Extensive ballistic calculations were carried out there for the U.S. Army during World War II with the aid of the Bush “differential analyzer” and more than a hundred women (“computers”) working on mechanical desk calculators. Observing that this capacity was barely sufficient to compute the artillery firing tables, the physicist John W. Mauchly and the electronic engineer John Presper Eckert started developing the ENIAC, a digital version of the differential analyzer, in 1943 with funding from the U.S. Army.

In 1944 the mathematician John von Neumann turned his attention to the ENIAC because of his mathematical work on the Manhattan Project (on the implosion design of the atomic bomb). While the ENIAC was being built, von Neumann and the ENIAC team drew up plans for a successor machine in order to remedy the shortcomings of the ENIAC concept, such as its very small memory and the time-consuming reprogramming (actually rewiring) required to change the setup for a new calculation. In these meetings the idea of a stored-program, universal machine evolved: memory would be used to store the program in addition to data, which would enable the machine to execute conditional branches and change the flow of the program. The concept of a computer in the modern sense of the word was born, and in 1945 von Neumann wrote the influential “First Draft of a Report on the EDVAC,” which described the stored-program, universal computer. The logical structure presented in this draft report is now referred to as the “von Neumann architecture.” The EDVAC report was originally intended for internal use, but once made freely available it became the “bible” for computer pioneers throughout the world in the 1940s and 1950s. One of the first computers featuring the von Neumann architecture operated at Cambridge University in the U.K.: in June 1949 the EDSAC (electronic delay storage automatic calculator), built by Maurice Wilkes and designed according to the EDVAC principles, became operational.

The Computer as a Scientific Instrument

As soon as the computer was invented, a growing demand for computers by scientists and engineers evolved, and numerous American and European universities started their own computer projects in the 1940s and 1950s. After the technical difficulties of building an electronic computer were solved, scientists grasped the opportunity to use the new scientific instrument for their research. For example, at the University of Gottingen in Germany, the early computers were used for the initial value problems of partial differential equations associated with hydrodynamic problems from atomic physics and aerodynamics. Another striking example was the application of von Neumann’s computer at the Institute for Advanced Study (IAS) in Princeton to numerical weather forecasts in 1950. As a result, numerical weather forecasts could be made on a regular basis from the mid-1950s onwards.

Mathematical methods have always been of importance for science and engineering, but only the use of the electronic digital computer (as an enabling technology) made it possible to broaden the application of mathematical methods to such a degree that, by the end of the twentieth century, research in science, medicine, and engineering without computer-based mathematical methods had become virtually inconceivable. A number of additional computer-based techniques, such as scientific visualization, medical imaging, computerized tomography, pattern recognition, image processing, and statistical applications, have become of the utmost significance for science, medicine, engineering, and the social sciences. In addition, the computer fundamentally changed the way engineers design and construct technical artifacts, through the use of computer-based methods such as computer-aided design (CAD), computer-aided manufacture (CAM), computer-aided engineering, control applications, and finite-element methods. However, the most striking example is the development of scientific computing and computer modeling, which became accepted as a third mode of scientific research that complements experimentation and theoretical analysis. Scientific computing and computer modeling are based on supercomputers as the enabling technology, which became important tools for modern science, routinely used to simulate physical and chemical phenomena. These high-speed computers became equated with the machines developed by Seymour Cray, who built the fastest computers in the world for many years. The supercomputers he launched, such as the legendary CRAY-1 of 1976, were the basis for computer modeling of real-world systems and helped, for example, the defense industry in the U.S. to build weapons systems and the oil industry to create geological models that show potential oil deposits.

Growth of Digital Computers in Business and Information Processing

When the digital computer was invented as a mathematical instrument in the 1940s, it could not have been foreseen that this new artifact would ever be of much importance in the business world. About 50 firms entered the computer business worldwide in the late 1940s and the early 1950s, and the computer was reconstructed as a type of electronic data-processing machine that took the place of punched-card technology as well as other office machine technology. Mainly three types of companies built computers in the 1950s and 1960s: newly created computer firms (such as the company founded by the ENIAC inventors Eckert and Mauchly), electronics and control equipment firms (such as RCA and General Electric), and office appliance companies (such as Burroughs and NCR). Despite the fact that the first digital computers were put on the market by a German and a British company, U.S. firms dominated the world market from the 1950s onward, as these firms had the biggest market as well as financial support from the government.

Generally speaking, the Cold War exerted an enormous influence on the development of computer technology. Until the early 1960s the U.S. military and the defense industry were the central drivers of the digital computer expansion, serving as the main market for computer technology and shaping and speeding up the formation of the rising computer industry. Because of the U.S. military’s role as the “tester” for prototype hardware and software, it had a direct and lasting influence on technological developments; in addition, it has to be noted that the spread of computer technology was partly hindered by military secrecy. Even after the emergence of a large civilian computer market in the 1960s, the U.S. military maintained its influence by investing a great deal in computer hardware and software and in computer research projects.

From the middle of the 1950s onwards the world computer market was dominated by IBM, which accounted for more than 70 percent of the computer industry’s revenues until the mid-1970s. The reasons for IBM’s overwhelming success were diverse, but the company had a unique combination of technical and organizational capabilities at its disposal that prepared it perfectly for the mainframe computer market. In addition, IBM benefited from enormous government contracts, which helped it develop excellence in computer technology and design. However, the greatest advantage of IBM was without doubt its marketing organization and its reputation as a service-oriented firm, which was used to working closely with customers to adapt machinery to their specific problems, and this key difference between IBM and its competitors persisted right into the computer age.

During the late 1950s and early 1960s, the computer market—consisting of IBM and seven other companies called the “seven dwarves”—was dominated by IBM, with its 650 and 1401 computers. By 1960 the market for computers was still small. Only about 7,000 computers had been delivered by the computer industry, and at this time even IBM was primarily a punched-card machine supplier, punched-card equipment still being the major source of its income. Only in 1960 did a boom in demand for computers begin, and by 1970 the number of computers installed worldwide had increased to more than 100,000. The computer industry was on track to become one of the world’s major industries, and it was totally dominated by IBM.

The outstanding computer system of this period was IBM’s System/360. Announced in 1964 as a compatible family of computers sharing the same architecture and using interchangeable peripheral devices, it was intended to solve IBM’s problems with a hotchpotch of incompatible product lines (which had caused large problems in the development and maintenance of a great many different hardware and software products). Despite the fact that neither the technology used nor the systems programming was state of the art at the time, the System/360 established a new standard for mainframe computers for decades. Various computer firms in the U.S., Europe, Japan, and even Russia concentrated on copying components and peripherals for the System/360, or tried to build System/360-compatible computers.

The growth of the computer market during the 1960s was accompanied by market shakeouts: two of the “seven dwarves” left the computer business after the first computer recession in the early 1970s, and afterwards the computer market was controlled by IBM and the BUNCH (Burroughs, UNIVAC, NCR, Control Data, and Honeywell). At the same time, an internationalization of the computer market took place—U.S. companies controlled the world market for computers—which caused considerable fear over loss of national independence in European and Japanese governments, and these fears subsequently spurred national computing programs. While the European attempts to create national champions, as well as the more general attempt to create a Europe-wide market for mainframe computers, failed in the end, Japan’s attempt to found a national computer industry has been successful: to this day Japan is the only nation able to compete with the U.S. in a wide array of high-tech computer-related products.

Real-Time and Time-Sharing

Until the 1960s almost all computers in government and business were running batch-processing applications (i.e., the computers were used in much the same way as the punched-card accounting machines they had replaced). In the early 1950s, however, the computer industry introduced a new mode of computing named “real-time” into the business sector for the first time; it was originally developed for military purposes in MIT’s Whirlwind project. This project was initially started in World War II with the aim of designing an aircraft simulator by analog methods, and later became a part of a research and development program for the gigantic, computerized anti-aircraft defense system SAGE (semi-automatic ground environment) built up by IBM in the 1950s.

The demand for this new mode of computing was created by cultural and structural changes in the economy. The increasing number of financial transactions in banks and insurance companies, as well as increasing airline travel, made new computer-based information systems necessary, which finally led to new forms of business evolution through information technology.

The case of the first computerized airline reservation system, SABRE, developed for American Airlines by IBM in the 1950s and finally implemented in the early 1960s, illustrates these cultural and structural changes in the economy. Until the early 1950s, airline reservations had been made manually without any problems, but by 1953 this system was in crisis because increased air traffic and growing flight plan complexity had made reservation costs insupportable. SABRE became a complete success, demonstrating the potential of centralized real-time computing systems connected via a network. The system enabled flight agents throughout the U.S., equipped with desktop terminals, to gain direct, real-time access to the central reservation system running on central IBM mainframe computers, while the airline was able to assign appropriate resources in response. SABRE therefore offered an effective combination of advantages—better utilization of resources and much greater customer convenience.

Very soon this new mode of computing spread throughout the business and government world and became commonplace across the service and distribution sectors of the economy; for example, bank tellers and insurance account representatives increasingly worked at terminals. On the one hand, structural information problems led managers in this direction; on the other hand, the increasing use of computers as information-handling machines in government and business had brought about the idea of computer-based, directly accessible data retrieval. In the end, more and more IBM customers wanted to link dozens of operators directly to central computers by using terminal keyboards and display screens.

In the late 1950s and early 1960s—at the same time that IBM and American Airlines had begun the development of the SABRE airline reservation system—a group of brilliant computer scientists had a new idea for computer usage named “time sharing.” Instead of dedicating a multi-terminal system solely to a single application, they had the computer-utility vision of organizing a mainframe computer so that several users could interact with it simultaneously. This vision was to change the nature of computing profoundly, because computing no longer had to be mediated for nontechnical users by programmers and systems analysts, and by the late 1960s time-sharing computers had become widespread in the U.S.

Particularly important for this development had been the work of J.C.R. Licklider of the Advanced Research Projects Agency (ARPA) of the U.S. Department of Defense. In 1960 Licklider published the now-classic paper “Man–Computer Symbiosis,” proposing the use of computers to augment human intellect and creating the vision of interactive computing. Licklider was very successful in translating his idea of a network allowing people on different computers to communicate into action, and convinced ARPA to start an enormous research program in 1962. Its budget surpassed that of all other sources of U.S. public research funding for computers combined. The ARPA research programs resulted in a series of fundamental advances in computer technology in areas such as computer graphics, artificial intelligence, and operating systems. For example, even the most influential current operating system, the general-purpose time-sharing system Unix, developed in the early 1970s at the Bell Laboratories, was a spin-off of an ambitious operating system project, Multics, funded by ARPA. The designers of Unix successfully kept complexity at bay by using a clear, minimalist approach to software design, and created a multitasking, multiuser operating system that became a standard operating system in the 1980s.

Electronic Component Revolution

While the nature of business computing was changed by new paradigms such as real time and time sharing, advances in solid-state components increasingly became a driving force for fundamental changes in the computer industry, and led to a dynamic interplay between new computer designs and new programming techniques that produced a remarkable series of technical developments. The technical progress of the mainframe computer had always run parallel to transformations in electronic components. During the period from 1945 to 1965, two fundamental transformations took place in the electronics industry, marked by the invention of the transistor in 1947 and the integrated circuit in 1957 to 1958. While the first generation of computers, lasting until about 1960, was characterized by vacuum tubes (valves) as switching elements, the second generation used the much smaller and more reliable transistors, which could be produced at a lower price. A new phase was inaugurated when an entire integrated circuit was produced on a chip of silicon in 1961, and when the first integrated circuits were produced for the military in 1962. A remarkable pace of progress in semiconductor innovations, known as the "revolution in miniature," began to speed up the computer industry. The third generation of computers, characterized by the use of integrated circuits, began with the announcement of the IBM System/360 in 1964 (although this computer system did not use true integrated circuits). The most important effect of the introduction of integrated circuits was not to strengthen the leading mainframe computer systems, but to undermine Grosch's Law, which stated that computing power increases as the square of its cost. In fact, the cost of computing power fell dramatically during the next ten years.

This became clear with the introduction of the first computer to use integrated circuits on a full scale in 1965: the Digital Equipment Corporation (DEC) offered its PDP-8 computer for just $18,000, creating a new class of computers called minicomputers, small in size and low in cost, and opening up the market to new customers. Minicomputers were mainly used in areas other than general-purpose computing, such as industrial applications and interactive graphics systems. The PDP-8 became the first widely successful minicomputer, with over 50,000 units sold, demonstrating that there was a market for smaller computers. This success of DEC (by 1970 it had become the world's third-largest computer manufacturer) was supported by dramatic advances in solid-state technology. During the 1960s the number of transistors on a chip doubled every two years, and as a result minicomputers became ever more powerful and less expensive at a remarkable pace.

Personal Computing

The most striking consequence of the exponential increase in the number of transistors on a chip during the 1960s, as stated by "Moore's Law" (the number of transistors on a chip doubled roughly every two years), was not the lowering of the costs of mainframe and minicomputer processing and storage, but the introduction of the first consumer products based on chip technology, such as hand-held calculators and digital watches, in about 1970. More specifically, market dynamics in these industries were changed overnight by the shift from mechanical to chip technology, which led to a collapse in prices as well as a dramatic industry shakeout. These episodes only marked the beginning of wide-ranging changes in the economy and society during the last quarter of the twentieth century, leading to a new situation in which chips played an essential role in almost every part of business and modern life.
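To get a feel for what doubling every two years means in practice, here is a minimal Python sketch of the compounding described by Moore's Law; the starting transistor count is only illustrative (roughly that of an early-1970s microprocessor) and not a figure taken from the text above.

```python
# Minimal sketch: compounding effect of a transistor count that doubles
# roughly every two years, as stated by Moore's Law above.
def transistors(start_count: int, years: int, doubling_period: float = 2.0) -> float:
    """Return the projected transistor count after `years` years."""
    return start_count * 2 ** (years / doubling_period)

if __name__ == "__main__":
    start = 2_300  # illustrative early-1970s chip, a few thousand transistors
    for span in (10, 20, 30):
        print(f"After {span} years: about {transistors(start, span):,.0f} transistors")
```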

The case of the invention of the personal computer illustrates that developing the microprocessor as the enabling technology was not by itself sufficient to create a new invention, and how much new technologies can be socially shaped by cultural factors and commercial interests. When the microprocessor, a single-chip integrated circuit implementation of a CPU, was launched by the semiconductor company Intel in 1971, there was no technical hindrance to producing a reasonably priced microcomputer, but it took six years until the PC emerged as a consumer product. None of the traditional mainframe and minicomputer companies were involved in creating the early personal computer. Instead, a group of computer hobbyists as well as the "computer liberation" movement in the U.S. became the driving force behind the invention of the PC. These two groups keenly wanted a low-priced kind of minicomputer for use at home for leisure activities such as computer games; beyond that, they had the counterculture vision of freely available, personal access to an inexpensive computer utility rich in information. When in 1975 the Altair 8800, an Intel 8080 microprocessor-based computer, was offered as an electronic hobbyist kit for less than $400, these two groups began to realize their vision of a "personal computer." Very soon dozens of computer clubs and computer magazines were founded around the U.S., and these computer enthusiasts created the personal computer by combining the Altair with keyboards, disk drives, and monitors as well as by developing standard software for it. Consequently, in only two years, a more or less useless hobbyist kit had been turned into a computer that could readily be made into a consumer product.

The computer hobbyist period ended in 1977, when the first standard machines for an emerging consumer mass market were sold. These included products such as the Commodore PET and the Apple II, which came with its own monitor, disk drive, and keyboard and was provided with several basic software packages. Over the next three years, spreadsheet, word processing, and database software were developed, and an immense market for games software evolved. As a result, personal computers became more and more a consumer product for ordinary people, and Apple's revenues shot to more than $500 million in 1982. By 1980, the personal computer had also become a business machine, and IBM decided to develop its own personal computer, which was introduced as the IBM PC in 1981. It became an overwhelming success and set a new industry standard.

Apple tried to compete by launching its new Macintosh computer in 1984, provided with a revolutionary graphical user interface (GUI) that set a new standard for user-friendly human–computer interaction. It was based on technology created by computer scientists at the Xerox Palo Alto Research Center in California, who had picked up on ideas about human–computer interaction developed at the Stanford Research Institute and at the University of Utah. Despite the fact that the Macintosh's GUI was far superior to the MS-DOS operating system of the IBM-compatible PCs, Apple failed to win the business market and remained a niche player with a market share of about 10 percent. The main branch of the PC industry was instead shaped by the companies IBM had chosen as its original suppliers in 1981 for the microprocessor (Intel) and the operating system (Microsoft). While IBM lost a software war with Microsoft over control of the PC operating system market, Microsoft achieved dominance not only of the key market for PC operating systems but also of the key market for office applications during the first half of the 1990s.

In the early 1990s computing underwent further fundamental changes with the appearance of the Internet, and for most computer users networking became an integral part of what it means to have a computer. Furthermore, the rise of the Internet signaled the impending arrival of a new "information infrastructure" as well as of a "digital convergence," as the coupling of computers and communications networks was often called.

In addition, the 1990s were a period of an information technology boom, largely driven by the Internet hype. For many years it seemed to many managers and journalists that the Internet would become not just an indispensable business tool, but also a miracle cure for economic growth and prosperity. At the same time, computer scientists and sociologists began a discussion predicting the beginning of a new "information age," with the Internet as a "technological revolution" reshaping the "material basis" of industrial societies.

The Internet was the outcome of an unusual collaboration within a military–industrial–academic complex that promoted the development of this extraordinary innovation. It grew out of a military network called the ARPAnet, a project established and funded by ARPA in the 1960s. The ARPAnet was initially devoted to supporting data communications for defense research projects and was used by only a small number of researchers in the 1970s. Its further development was primarily driven by unintended forms of network usage. Users of the ARPAnet became very attracted to the opportunity to communicate through electronic mail, which rapidly surpassed all other forms of network activity. Another unplanned spin-off of the ARPAnet was the Usenet (Unix User Network), which started in 1979 as a link between two universities and enabled its users to subscribe to newsgroups. Electronic mail became a driving force for the creation of a large number of new proprietary networks funded by the existing computer services industry or by organizations such as the NSF (NSFnet). Because network users wanted email to be able to cross network boundaries, an ARPA project on "internetworking" became the origin of the "Internet": a network of networks linked by several layers of protocols such as TCP/IP (Transmission Control Protocol/Internet Protocol), which quickly developed into the de facto standard.

Only after government funding had solved many of the most essential technical issues and had shaped a number of the Internet's most characteristic features did private-sector entrepreneurs start Internet-related ventures and quickly develop user-oriented enhancements. Nevertheless, the Internet did not make a promising start, and it took more than ten years before significant numbers of networks were connected. In 1980, the Internet had fewer than two hundred hosts, and during the next four years the number of hosts rose only to about 1,000. Only when the Internet reached the educational and business community of PC users in the late 1980s did it start to become an important economic and social phenomenon. The number of hosts began to grow explosively in the late 1980s; by 1988 there were over 50,000 hosts. An important and unforeseen side effect of this development was the transformation of the Internet into a new electronic publishing medium. The electronic publishing development that excited the most interest in the Internet was the World Wide Web, originally developed at CERN, the high-energy physics laboratory in Geneva, in 1989. Soon there were millions of documents on the Internet, and private PC users became excited by the joys of surfing the web. A number of firms such as AOL soon provided low-cost network access and a range of consumer-oriented information services. The Internet boom was also helped by the Clinton–Gore presidential election campaign on the "information superhighway" and by the extensive news reporting on the national information infrastructure in the early 1990s. Nevertheless, for many observers it was astounding how fast the number of hosts on the Internet increased during the next few years: from more than 1 million in 1992 to 72 million in 1999.

The overwhelming success of the PC and of the Internet tends to hide the fact that their arrival marked a branching in computer history, not a succession. Mainframe computers, for example, still continue to run and remain of great importance to government facilities and to the private sector (such as banks and insurance companies), while supercomputers remain of the utmost significance for modern science and engineering. Furthermore, it should be noted that only a small part of the computing performed today is easily observable: some 98 percent of programmable CPUs are used in embedded systems such as automobiles, medical devices, washing machines, and mobile telephones.


Computer Science: Recently Published Documents


Hiring CS Graduates: What We Learned from Employers

Computer science (CS) majors are in high demand and account for a large part of national computer and information technology job market applicants. Employment in this sector is projected to grow 12% between 2018 and 2028, which is faster than the average of all other occupations. Published data are available on traditional non-computer-science-specific hiring processes. However, the hiring process for CS majors may be different. It is critical to have up-to-date information on questions such as “what positions are in high demand for CS majors?,” “what is a typical hiring process?,” and “what do employers say they look for when hiring CS graduates?” This article discusses the analysis of a survey of 218 recruiters hiring CS graduates in the United States. We used Atlas.ti to analyze qualitative survey data and report the results on what positions are in the highest demand, the hiring process, and the resume review process. Our study revealed that a software developer was the most common job the recruiters were looking to fill. We found that the hiring process steps for CS graduates are generally aligned with traditional hiring steps, with an additional emphasis on technical and coding tests. Recruiters reported that their hiring choices were based on reviewing the experience, GPA, and projects sections of resumes. The results provide insights into the hiring process, decision making, resume analysis, and some discrepancies between current undergraduate CS program outcomes and employers’ expectations.

A Systematic Literature Review of Empiricism and Norms of Reporting in Computing Education Research Literature

Context. Computing Education Research (CER) is critical to help the computing education community and policy makers support the increasing population of students who need to learn computing skills for future careers. For a community to systematically advance knowledge about a topic, the members must be able to understand published work thoroughly enough to perform replications, conduct meta-analyses, and build theories. There is a need to understand whether published research allows the CER community to systematically advance knowledge and build theories. Objectives. The goal of this study is to characterize the reporting of empiricism in Computing Education Research literature by identifying whether publications include content necessary for researchers to perform replications, meta-analyses, and theory building. We answer three research questions related to this goal: (RQ1) What percentage of papers in CER venues have some form of empirical evaluation? (RQ2) Of the papers that have empirical evaluation, what are the characteristics of the empirical evaluation? (RQ3) Of the papers that have empirical evaluation, do they follow norms (both for inclusion and for labeling of information needed for replication, meta-analysis, and, eventually, theory-building) for reporting empirical work? Methods. We conducted a systematic literature review of the 2014 and 2015 proceedings or issues of five CER venues: Technical Symposium on Computer Science Education (SIGCSE TS), International Symposium on Computing Education Research (ICER), Conference on Innovation and Technology in Computer Science Education (ITiCSE), ACM Transactions on Computing Education (TOCE), and Computer Science Education (CSE). We developed and applied the CER Empiricism Assessment Rubric to the 427 papers accepted and published at these venues over 2014 and 2015. Two people evaluated each paper using the Base Rubric for characterizing the paper. An individual person applied the other rubrics to characterize the norms of reporting, as appropriate for the paper type. Any discrepancies or questions were discussed between multiple reviewers to resolve. Results. We found that over 80% of papers accepted across all five venues had some form of empirical evaluation. Quantitative evaluation methods were the most frequently reported. Papers most frequently reported results on interventions around pedagogical techniques, curriculum, community, or tools. There was a split in papers that had some type of comparison between an intervention and some other dataset or baseline. Most papers reported related work, following the expectations for doing so in the SIGCSE and CER community. However, many papers were lacking properly reported research objectives, goals, research questions, or hypotheses; description of participants; study design; data collection; and threats to validity. These results align with prior surveys of the CER literature. Conclusions. CER authors are contributing empirical results to the literature; however, not all norms for reporting are met. We encourage authors to provide clear, labeled details about their work so readers can use the study methodologies and results for replications and meta-analyses. As our community grows, our reporting of CER should mature to help establish computing education theory to support the next generation of computing learners.

Light Diacritic Restoration to Disambiguate Homographs in Modern Arabic Texts

Diacritic restoration (also known as diacritization or vowelization) is the process of inserting the correct diacritical markings into a text. Modern Arabic is typically written without diacritics, e.g., in newspapers. This lack of diacritical markings often causes ambiguity, and though natives are adept at resolving it, there are times they may fail. Diacritic restoration is a classical problem in computer science. Still, while most works tackle the full (heavy) diacritization of text, we are interested in diacritizing the text using fewer diacritics. Studies have shown that a fully diacritized text is visually displeasing and slows down reading. This article proposes a system to diacritize homographs using the least number of diacritics, hence the name “light.” There is a large class of words that fall under the homograph category, and we deal with the class of words that share the spelling but not the meaning. With fewer diacritics, we do not expect any effect on reading speed, while eye strain is reduced. The system combines a morphological analyzer with context similarities. The morphological analyzer is used to generate all word candidates for diacritics. Then, through a statistical approach and context similarities, we resolve the homographs. Experimentally, the system shows very promising results, and our best accuracy is 85.6%.

A genre-based analysis of questions and comments in Q&A sessions after conference paper presentations in computer science

Gender diversity in computer science at a large public R1 research university: reporting on a self-study.

With the number of jobs in computer occupations on the rise, there is a greater need for computer science (CS) graduates than ever. At the same time, most CS departments across the country are only seeing 25–30% of women students in their classes, meaning that we are failing to draw interest from a large portion of the population. In this work, we explore the gender gap in CS at Rutgers University–New Brunswick, a large public R1 research university, using three data sets that span thousands of students across six academic years. Specifically, we combine these data sets to study the gender gaps in four core CS courses and explore the correlation of several factors with retention and the impact of these factors on changes to the gender gap as students proceed through the CS courses toward completing the CS major. For example, we find that a significant percentage of women students taking the introductory CS1 course for majors do not intend to major in CS, which may be a contributing factor to a large increase in the gender gap immediately after CS1. This finding implies that part of the retention task is attracting these women students to further explore the major. Results from our study include both novel findings and findings that are consistent with known challenges for increasing gender diversity in CS. In both cases, we provide extensive quantitative data in support of the findings.

Designing for Student-Directedness: How K–12 Teachers Utilize Peers to Support Projects

Student-directed projects—projects in which students have individual control over what they create and how to create it—are a promising practice for supporting the development of conceptual understanding and personal interest in K–12 computer science classrooms. In this article, we explore a central (and perhaps counterintuitive) design principle identified by a group of K–12 computer science teachers who support student-directed projects in their classrooms: in order for students to develop their own ideas and determine how to pursue them, students must have opportunities to engage with other students’ work. In this qualitative study, we investigated the instructional practices of 25 K–12 teachers using a series of in-depth, semi-structured interviews to develop understandings of how they used peer work to support student-directed projects in their classrooms. Teachers described supporting their students in navigating three stages of project development: generating ideas, pursuing ideas, and presenting ideas. For each of these three stages, teachers considered multiple factors to encourage engagement with peer work in their classrooms, including the quality and completeness of shared work and the modes of interaction with the work. We discuss how this pedagogical approach offers students new relationships to their own learning, to their peers, and to their teachers and communicates important messages to students about their own competence and agency, potentially contributing to aims within computer science for broadening participation.

Creativity in CS1: A Literature Review

Computer science is a fast-growing field in today’s digitized age, and working in this industry often requires creativity and innovative thought. An issue within computer science education, however, is that large introductory programming courses often involve little opportunity for creative thinking within coursework. The undergraduate introductory programming course (CS1) is notorious for its poor student performance and retention rates across multiple institutions. Integrating opportunities for creative thinking may help combat this issue by adding a personal touch to course content, which could allow beginner CS students to better relate to the abstract world of programming. Research on the role of creativity in computer science education (CSE) is an interesting area with a lot of room for exploration due to the complexity of the phenomenon of creativity as well as the CSE research field being fairly new compared to some other education fields where this topic has been more closely explored. To contribute to this area of research, this article provides a literature review exploring the concept of creativity as relevant to computer science education and CS1 in particular. Based on the review of the literature, we conclude creativity is an essential component to computer science, and the type of creativity that computer science requires is in fact, a teachable skill through the use of various tools and strategies. These strategies include the integration of open-ended assignments, large collaborative projects, learning by teaching, multimedia projects, small creative computational exercises, game development projects, digitally produced art, robotics, digital story-telling, music manipulation, and project-based learning. Research on each of these strategies and their effects on student experiences within CS1 is discussed in this review. Last, six main components of creativity-enhancing activities are identified based on the studies about incorporating creativity into CS1. These components are as follows: Collaboration, Relevance, Autonomy, Ownership, Hands-On Learning, and Visual Feedback. The purpose of this article is to contribute to computer science educators’ understanding of how creativity is best understood in the context of computer science education and explore practical applications of creativity theory in CS1 classrooms. This is an important collection of information for restructuring aspects of future introductory programming courses in creative, innovative ways that benefit student learning.

CATS: Customizable Abstractive Topic-based Summarization

Neural sequence-to-sequence models are the state-of-the-art approach used in abstractive summarization of textual documents, useful for producing condensed versions of source text narratives without being restricted to using only words from the original text. Despite the advances in abstractive summarization, custom generation of summaries (e.g., towards a user’s preference) remains unexplored. In this article, we present CATS, an abstractive neural summarization model that summarizes content in a sequence-to-sequence fashion while also introducing a new mechanism to control the underlying latent topic distribution of the produced summaries. We empirically illustrate the efficacy of our model in producing customized summaries and present findings that facilitate the design of such systems. We use the well-known CNN/DailyMail dataset to evaluate our model. Furthermore, we present a transfer-learning method and demonstrate the effectiveness of our approach in a low resource setting, i.e., abstractive summarization of meetings minutes, where combining the main available meetings’ transcripts datasets, AMI and International Computer Science Institute(ICSI) , results in merely a few hundred training documents.

Exploring students’ and lecturers’ views on collaboration and cooperation in computer science courses - a qualitative analysis

Factors affecting student educational choices regarding OER material in computer science

University Library, University of Illinois at Urbana-Champaign


Electrical and Computer Engineering Research Resources: Find Articles & Papers


Engineering Easy Search

University Library Search Engines

  • Grainger Engineering Library Homepage With specialized searches for Engineering and the Physical Sciences.
  • Easy Search The easiest way to locate University Library resources, materials, and more!
  • Find Online Journals Search by title or by subject to view our subscription details, including date ranges and where you can access full text.
  • Journal and Article Locator Finds electronic or print copy of articles by using a citation.

Engineering Article Databases

  • Engineering Village Search for articles, conference papers, and report information in all areas of engineering. Full text is often available through direct download.
  • Scopus Search periodicals, conference proceedings, technical reports, trade literature, patents, books, and press releases in all engineering fields. Some full text available as direct downloads.
  • Web of Science (Core Collection) Search for articles in science and engineering. Also provides the Science Citation Index, which tracks citations in science and technical journals published since 1981. Journal Citation Reports are also available through ISI.

Electrical & Computer Engineering

  • ACM Digital Library This site provides access to tables of contents, abstracts, reviews, and full text of every article ever published by ACM and bibliographic citations from major publishers in computing.
  • ENGnetBASE A collection of best-selling engineering handbooks and reference titles. Includes access to sub-collections: CivilENGINEERINGnetBASE, ElectricalENGINEERINGnetBASE, GeneralENGINEERINGnetBASE, IndustrialENGINEERINGnetBASE, MechanicalENGINEERINGnetBASE, MiningENGINEERINGnetBASE.
  • IEEE Xplore Provides full-text access to IEEE transactions, IEEE and IEE journals, magazines, and conference proceedings published since 1988, and all current IEEE standards; brings additional search and access features to IEEE/IEE digital library users. Browsable by books & e-books, conference publications, education and learning, journals and magazines, standards and by topic. Also provides links to IEEE standards, IEEE spectrum and other sites.
  • INSPEC Database providing access to bibliographic citations and abstracts of the scientific and technical literature in physics, electrical engineering, electronics, communications, control engineering, computers and computing, information technology, manufacturing and production engineering. Material covered includes journal articles, conference proceedings, reports, dissertations, patents and books published around the world.
  • Microelectronics Packaging Materials Database (MPMD) The MPMD database contains data and information on thermal, mechanical, electrical and physical properties of electronics packaging materials. Available in a Web-based format. The database is continually updated and expanded.
  • SPIE Digital Library Contains full-text papers on optics and photonics from SPIE journals and proceedings published since 1990. Approximately 15,000 new papers will be added each year.


ScholarWorks@UMass Amherst


Electrical & Computer Engineering Masters Theses Collection

Theses from 2024

Extracting DNN Architectures Via Runtime Profiling On Mobile GPUs , Dong Hyub Kim, Electrical & Computer Engineering

Semantic-Aware Blockchain Architecture Design for Lifelong Edge-enabled Metaverse , Ning Wang, Electrical & Computer Engineering

Blockchain Design for a Secure Pharmaceutical Supply Chain , Zhe Xu, Electrical & Computer Engineering

Collaborative Caching and Computation Offloading for Intelligent Transportation Systems enabled by Satellite-Airborne-Terrestrial Networks , Shulun Yang, Electrical & Computer Engineering

Protecting Return Address Integrity for RISC-V via Pointer Authentication , Yuhe Zhao, Electrical & Computer Engineering

Theses from 2023

Fingerprinting for Chiplet Architectures Using Power Distribution Network Transients , Matthew G. Burke, Electrical & Computer Engineering

Design and Fabrication of a Trapped Ion Quantum Computing Testbed , Christopher A. Caron, Electrical & Computer Engineering

Analog Cancellation of a Known Remote Interference: Hardware Realization and Analysis , James M. Doty, Electrical & Computer Engineering

Electrothermal Properties of 2D Materials in Device Applications , Samantha L. Klein, Electrical & Computer Engineering

Ablation Study on Deeplabv3+ for Semantic Segmentation , Bowen Lei, Electrical & Computer Engineering

A Composability-Based Transformer Pruning Framework , Yuping Lin, Electrical & Computer Engineering

A Model Extraction Attack on Deep Neural Networks Running on GPUs , Jonah G. O'Brien Weiss, Electrical & Computer Engineering

Heterogeneous IoT Network Architecture Design for Age of Information Minimization , Xiaohao Xia, Electrical & Computer Engineering

Theses from 2022

Theory and Analysis of Backprojection Processing for Interferometric SAR , Marc Closa Tarres, Electrical & Computer Engineering

Unpaired Skeleton-to-Photo Translation for Sketch-to-Photo Synthesis , Yuanzhe Gu, Electrical & Computer Engineering

Integration of Digital Signal Processing Block in SymbiFlow FPGA Toolchain for Artix-7 Devices , Andrew T. Hartnett, Electrical & Computer Engineering

Planar Ultra-Wideband Modular Antenna (PUMA) Arrays for High-Volume Manufacturing on Organic Laminates and BGA Interfaces , James R. LaCroix, Electrical & Computer Engineering

Planar Transmission-Line Metamaterials on an Irregular Grid , Tina E. Maurer, Electrical & Computer Engineering

Formally Verifiable Synthesis Flow In FPGAs , Anurag V. Muttur, Electrical & Computer Engineering

Theses from 2021

Graph-Algorithm Based Verification on Network Configuration Robustness , Zibin Chen, Electrical & Computer Engineering

A Cloud Infrastructure for Large Scale Health Monitoring in Older Adult Care Facilities , Uchechukwu Gabriel David, Electrical & Computer Engineering

Internet Infrastructures for Large Scale Emulation with Efficient HW/SW Co-design , Aiden K. Gula, Electrical & Computer Engineering

Mtemp: An Ambient Temperature Estimation Method Using Acoustic Signal on Mobile Devices , Hao Guo, Electrical & Computer Engineering

BENCHMARKING SMALL-DATASET STRUCTURE-ACTIVITY-RELATIONSHIP MODELS FOR PREDICTION OF WNT SIGNALING INHIBITION , Mahtab Kokabi, Electrical & Computer Engineering

ACTION : Adaptive Cache Block Migration in Distributed Cache Architectures , Chandra Sekhar Mummidi, Electrical & Computer Engineering

Modeling and Characterization of Optical Metasurfaces , Mahsa Torfeh, Electrical & Computer Engineering

TickNet: A Lightweight Deep Classifier for Tick Recognition , Li Wang, Electrical & Computer Engineering

Lecture Video Transformation through An Intelligent Analysis and Post-processing System , Xi Wang, Electrical & Computer Engineering

Correcting For Terrain Interference, Attenuation, and System Bias for a Dual Polarimetric, X-Band Radar , Casey Wolsieffer, Electrical & Computer Engineering

Theses from 2020

Numerical Simulation of Thermoelectric Transport in Bulk and Nanostructured SiSn Alloys , Venkatakrishna Dusetty, Electrical & Computer Engineering

Deep Reinforcement Learning For Distributed Fog Network Probing , Xiaoding Guan, Electrical & Computer Engineering

COMPRESSIVE PARAMETER ESTIMATION VIA APPROXIMATE MESSAGE PASSING , Shermin Hamzehei, Electrical & Computer Engineering

Metric Learning via Linear Embeddings for Human Motion Recognition , ByoungDoo Kong, Electrical & Computer Engineering

Compound Effects of Clock and Voltage Based Power Side-Channel Countermeasures , Jacqueline Lagasse, Electrical & Computer Engineering

Network Virtualization and Emulation using Docker, OpenvSwitch and Mininet-based Link Emulation , Narendra Prabhu, Electrical & Computer Engineering

Thermal Transport Modeling Of Semiconductor Materials From First Principles , Aliya Qureshi, Electrical & Computer Engineering

CROSSTALK BASED SIDE CHANNEL ATTACKS IN FPGAs , Chethan Ramesh, Electrical & Computer Engineering

Accelerating RSA Public Key Cryptography via Hardware Acceleration , Pavithra Ramesh, Electrical & Computer Engineering

Real-Time TDDFT-Based Filtered Spectroscopy , Ivan Williams, Electrical & Computer Engineering

Perception System: Object and Landmark Detection for Visually Impaired Users , Chenguang Zhang, Electrical & Computer Engineering

Theses from 2019

An Empirical Analysis of Network Traffic: Device Profiling and Classification , Mythili Vishalini Anbazhagan, Electrical & Computer Engineering

Pre-Travel Training And Real-Time Guidance System For People With Disabilities In Indoor Environments , Binru Cao, Electrical & Computer Engineering

Energy Efficiency of Computation in All-spin Logic: Projections and Fundamental Limits , Zongya Chen, Electrical & Computer Engineering

Improving Resilience of Communication in Information Dissemination for Time-Critical Applications , Rajvardhan Somraj Deshmukh, Electrical & Computer Engineering

InSAR Simulations for SWOT and Dual Frequency Processing for Topographic Measurements , Gerard Masalias Huguet, Electrical & Computer Engineering

A Study on Controlling Power Supply Ramp-Up Time in SRAM PUFs , Harshavardhan Ramanna, Electrical & Computer Engineering

The UMass Experimental X-Band Radar (UMAXX): An Upgrade of the CASA MA-1 to Support Cross-Polarization Measurements , Jezabel Vilardell Sanchez, Electrical & Computer Engineering

A Video-Based System for Emergency Preparedness and Recovery , Juechen Yin, Electrical & Computer Engineering

Theses from 2018

Phonon Transport at Boundaries and Interfaces in Two-Dimensional Materials , Cameron Foss, Electrical & Computer Engineering

SkinnySensor: Enabling Battery-Less Wearable Sensors Via Intrabody Power Transfer , Neev Kiran, Electrical & Computer Engineering

Immersive Pre-travel Training Application for Seniors and People with Disabilities , Yang Li, Electrical & Computer Engineering

Analog Computing using 1T1R Crossbar Arrays , Yunning Li, Electrical & Computer Engineering

On-Chip Communication and Security in FPGAs , Shivukumar Basanagouda Patil, Electrical & Computer Engineering

CROWDSOURCING BASED MICRO NAVIGATION SYSTEM FOR VISUALLY IMPAIRED , Quan Shi, Electrical & Computer Engineering

AN EVALUATION OF SDN AND NFV SUPPORT FOR PARALLEL, ALTERNATIVE PROTOCOL STACK OPERATIONS IN FUTURE INTERNETS , Bhushan Suresh, Electrical & Computer Engineering

Applications Of Physical Unclonable Functions on ASICS and FPGAs , Mohammad Usmani, Electrical & Computer Engineering

Improvements to the UMASS S-Band FM-CW Vertical Wind Profiling Radar: System Performance and Data Analysis. , Joseph Waldinger, Electrical & Computer Engineering

Theses from 2017

AutoPlug: An Automated Metadata Service for Smart Outlets , Lurdh Pradeep Reddy Ambati, Electrical & Computer Engineering

SkyNet: Memristor-based 3D IC for Artificial Neural Networks , Sachin Bhat, Electrical & Computer Engineering

Navigation Instruction Validation Tool and Indoor Wayfinding Training System for People with Disabilities , Linlin Ding, Electrical & Computer Engineering

Energy Efficient Loop Unrolling for Low-Cost FPGAs , Naveen Kumar Dumpala, Electrical & Computer Engineering

Effective Denial of Service Attack on Congestion Aware Adaptive Network on Chip , Vijaya Deepak Kadirvel, Electrical & Computer Engineering

VIRTUALIZATION OF CLOSED-LOOP SENSOR NETWORKS , Priyanka Dattatri Kedalagudde, Electrical & Computer Engineering

The Impact of Quantum Size Effects on Thermoelectric Performance in Semiconductor Nanostructures , Adithya Kommini, Electrical & Computer Engineering

MAGNETO-ELECTRIC APPROXIMATE COMPUTATIONAL FRAMEWORK FOR BAYESIAN INFERENCE , Sourabh Kulkarni, Electrical & Computer Engineering

Time Domain SAR Processing with GPUs for Airborne Platforms , Dustin Lagoy, Electrical & Computer Engineering

Query on Knowledge Graphs with Hierarchical Relationships , Kaihua Liu, Electrical & Computer Engineering

HIGH PERFORMANCE SILVER DIFFUSIVE MEMRISTORS FOR FUTURE COMPUTING , Rivu Midya, Electrical & Computer Engineering

Achieving Perfect Location Privacy in Wireless Devices Using Anonymization , Zarrin Montazeri, Electrical & Computer Engineering

KaSI: a Ka-band and S-band Cross-track Interferometer , Gerard Ruiz Carregal, Electrical & Computer Engineering

Analyzing Spark Performance on Spot Instances , Jiannan Tian, Electrical & Computer Engineering

Indoor Navigation For The Blind And Visually Impaired: Validation And Training Methodology Using Virtual Reality , Sili Wang, Electrical & Computer Engineering

Efficient Scaling of a Web Proxy Cluster , Hao Zhang, Electrical & Computer Engineering

ORACLE GUIDED INCREMENTAL SAT SOLVING TO REVERSE ENGINEER CAMOUFLAGED CIRCUITS , Xiangyu Zhang, Electrical & Computer Engineering

Theses from 2016

Seamless Application Delivery Using Software Defined Exchanges , Divyashri Bhat, Electrical & Computer Engineering

PROCESSOR TEMPERATURE AND RELIABILITY ESTIMATION USING ACTIVITY COUNTERS , Mayank Chhablani, Electrical & Computer Engineering

PARQ: A MEMORY-EFFICIENT APPROACH FOR QUERY-LEVEL PARALLELISM , Qianqian Gao, Electrical & Computer Engineering

Accelerated Iterative Algorithms with Asynchronous Accumulative Updates on a Heterogeneous Cluster , Sandesh Gubbi Virupaksha, Electrical & Computer Engineering

Improving Efficiency of Thermoelectric Devices Made of Si-Ge, Si-Sn, Ge-Sn, and Si-Ge-Sn Binary and Ternary Alloys , Seyedeh Nazanin Khatami, Electrical & Computer Engineering

6:1 PUMA Arrays: Designs and Finite Array Effects , Michael Lee, Electrical & Computer Engineering

Protecting Controllers against Denial-of-Service Attacks in Software-Defined Networks , Jingrui Li, Electrical & Computer Engineering

INFRASTRUCTURE-FREE SECURE PAIRING OF MOBILE DEVICES , Chunqiu Liu, Electrical & Computer Engineering

Extrinsic Effects on Heat and Electron Transport In Two-Dimensional Van-Der Waals Materials- A Boltzmann Transport Study , Arnab K. Majee, Electrical & Computer Engineering

SpotLight: An Information Service for the Cloud , Xue Ouyang, Electrical & Computer Engineering

Localization, Visualization And Evacuation Guidance System In Emergency Situations , Jingyan Tang, Electrical & Computer Engineering

Variation Aware Placement for Efficient Key Generation using Physically Unclonable Functions in Reconfigurable Systems , Shrikant S. Vyas, Electrical & Computer Engineering

EVALUATING FEATURES FOR BROAD SPECIES BASED CLASSIFICATION OF BIRD OBSERVATIONS USING DUAL-POLARIZED DOPPLER WEATHER RADAR , Sheila Werth, Electrical & Computer Engineering

Theses from 2015

Quality Factor of Horizontal Wire Dipole Antennas near Planar Conductor or Dielectric Interface , Adebayo Gabriel Adeyemi, Electrical & Computer Engineering

Evaluation of Two-Dimensional Codes for Digital Information Security in Physical Documents , Shuai Chen, Electrical & Computer Engineering

Design and Implementation of a High Performance Network Processor with Dynamic Workload Management , Padmaja Duggisetty, Electrical & Computer Engineering

Wavelet-Based Non-Homogeneous Hidden Markov Chain Model For Hyperspectral Signature Classification , Siwei Feng, Electrical & Computer Engineering

DEVELOPMENT OF INFRARED AND TERAHERTZ BOLOMETERS BASED ON PALLADIUM AND CARBON NANOTUBES USING ROLL TO ROLL PROCESS , Amulya Gullapalli, Electrical & Computer Engineering

Development of Prototypes of a Portable Road Weather Information System , Meha Kainth, Electrical & Computer Engineering

ADACORE: Achieving Energy Efficiency via Adaptive Core Morphing at Runtime , Nithesh Kurella, Electrical & Computer Engineering

Architecting SkyBridge-CMOS , Mingyu Li, Electrical & Computer Engineering

Function Verification of Combinational Arithmetic Circuits , Duo Liu, Electrical & Computer Engineering

ENERGY EFFICIENCY EXPLORATION OF COARSE-GRAIN RECONFIGURABLE ARCHITECTURE WITH EMERGING NONVOLATILE MEMORY , Xiaobin Liu, Electrical & Computer Engineering

Development of a Layout-Level Hardware Obfuscation Tool to Counter Reverse Engineering , Shweta Malik, Electrical & Computer Engineering

Energy Agile Cluster Communication , Muhammad Zain Mustafa, Electrical & Computer Engineering

Architecting NP-Dynamic Skybridge , Jiajun Shi, Electrical & Computer Engineering



Latest Computer Science Research Topics for 2024


Everybody has a dream: becoming a doctor, an astronaut, or anything else your imagination allows. If you are someone with a keen interest in looking for answers and knowing the "why" behind things, you might be a good fit for research. And if that interest revolves around computers and tech, you could make an excellent computer science researcher!

As a tech enthusiast, you know how technology makes our lives easier and more comfortable. With a single click, Google can answer your silliest query or point you to the best restaurants around you. Do you know what generates those answers? Want to learn about the science behind these gadgets and the internet?

For this, you will have to do a bit of research. Here we will learn about top computer science thesis topics and computer science thesis ideas.

Top 12 Computer Science Research Topics for 2024 

Before starting your research, it is important to know which research paper ideas in computer science are currently trending. It is not easy to land on the best research topic right away, so spend some time reading about the following ideas before selecting one.

1. Integrated Blockchain and Edge Computing Systems: A Survey, Some Research Issues, and Challenges


Welcome to the era of seamless connectivity and unparalleled efficiency! Blockchain and edge computing are two cutting-edge technologies that have the potential to revolutionize numerous sectors. Blockchain is a distributed ledger technology that is decentralized and offers a safe and transparent method of storing and transferring data.

As a young researcher, you can pave the way for a more secure, efficient, and scalable architecture that integrates blockchain and edge computing systems. So, let's roll up our sleeves and get ready to push the boundaries of technology with this exciting innovation!

Edge computing entails processing data close to where it is generated, such as sensors and IoT devices, which helps reduce latency and boost speed. Blockchain, for its part, adds a decentralized, tamper-resistant record of what those devices report. Integrating edge computing with blockchain technologies can help achieve a safer, more efficient, and more scalable architecture.
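As a rough illustration of the ledger side of such an architecture, the minimal Python sketch below chains blocks of hypothetical sensor readings together with SHA-256 hashes, so that tampering with any block breaks the chain. It is a toy model of the tamper-evidence idea, not the integrated blockchain and edge system itself.

```python
import hashlib
import json
import time

def make_block(data: dict, prev_hash: str) -> dict:
    """Create a block whose hash covers its payload and the previous block's hash."""
    block = {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

# A tiny chain of readings that an edge node might commit to a ledger (made-up values).
genesis = make_block({"reading": "init"}, prev_hash="0" * 64)
block_1 = make_block({"sensor": "temp-01", "value": 21.7}, prev_hash=genesis["hash"])

# Tampering with earlier data would invalidate every later hash link.
print(block_1["prev_hash"] == genesis["hash"])  # True while the chain is intact
```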

Moreover, this research title for computer science might open doors of opportunities for you in the financial sector.

2. A Survey on Edge Computing Systems and Tools


With the rise in population and connected devices, data is multiplying manifold each day. It is high time we find efficient technology to store and process it, and more research is required to get there.

Say hello to the future of computing with edge computing! An edge computing system can store large amounts of data close to where it is produced and retrieve it quickly when needed, while still drawing on computing resources from the cloud and data centers for heavier processing.

Edge computing systems bring processing power closer to the data source, resulting in faster and more efficient computing. But what tools are available to help us harness the power of edge computing?

As a part of this research, you will look at the newest edge computing tools and technologies to see how they can improve your computing experience. Here are some of the tools you might get familiar with upon completion of this research:

  • Apache NiFi:  A framework for data processing that enables users to gather, transform, and transfer data from edge devices to cloud computing infrastructure.
  • Microsoft Azure IoT Edge: A platform in the cloud that enables the creation and deployment of cutting-edge intelligent applications.
  • OpenFog Consortium: An organization that supports the advancement of fog computing technologies and reference architectures.
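To give a flavour of the pattern these tools support, here is a minimal, self-contained Python sketch (the readings and threshold are made up) in which data is summarized near the source so that only a small digest would need to travel to the cloud.

```python
from statistics import mean

def summarize_at_edge(raw_readings, threshold=75.0):
    """Aggregate raw sensor readings locally and flag outliers,
    so only a compact summary needs to be sent upstream."""
    return {
        "count": len(raw_readings),
        "mean": round(mean(raw_readings), 2),
        "alerts": [r for r in raw_readings if r > threshold],
    }

# Hypothetical readings from a local sensor; only the summary would be uploaded.
readings = [68.2, 70.1, 91.4, 69.8, 77.3]
print(summarize_at_edge(readings))
```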

3. Machine Learning: Algorithms, Real-world Applications, and Research Directions

Machine learning is a subset of Artificial Intelligence: a ground-breaking technology used to train machines to learn from data and mimic human actions. ML is used in everything from virtual assistants to self-driving cars and is revolutionizing the way we interact with computers. But what is machine learning exactly, and what are some of its practical uses and future research directions?

To find answers to such questions, it can be a wonderful choice to pick from the pool of various computer science dissertation ideas.

You will discover how computers learn to perform tasks without being explicitly programmed and see how far they can go beyond their current capabilities. To understand this better, some basic programming knowledge always helps. KnowledgeHut’s Programming course for beginners will help you learn the most in-demand programming languages and technologies with hands-on projects.

During the research, you will work on and study

  • Algorithms: Machine learning includes many algorithms, from decision trees to neural networks (a minimal decision-tree sketch follows this list).
  • Applications in the Real World: You can see ML in use in many places. It can help detect and diagnose diseases like cancer early, flag fraud when you are making payments, and drive personalized advertising.
  • Research Trends: The most recent developments in machine learning research include explainable AI, reinforcement learning, and federated learning.
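As promised above, here is a small taste of the algorithm side: a sketch that trains a decision tree on the classic iris dataset. It assumes the scikit-learn library is installed and is only a minimal starting point, not a research-grade pipeline.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Load a small benchmark dataset and split it for training and testing.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

# Fit a decision tree, one of the classic algorithm families mentioned above.
model = DecisionTreeClassifier(max_depth=3, random_state=42)
model.fit(X_train, y_train)

print("Test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```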

While a single research paper is not enough to shed light on a domain as vast as machine learning, it can help you see how applicable it is in numerous fields, such as engineering, data science and analysis, business intelligence, and many more.

Whether you are a data scientist with years of experience or a curious tech enthusiast, machine learning is an intriguing and vital field that's influencing the direction of technology. So why not dig deeper?

4. Evolutionary Algorithms and their Applications to Engineering Problems


Imagine a system that can solve most of your complex optimization problems. Interested in knowing how such systems work? The answer lies in the algorithms behind them. But what are they, and how do they work? Rather than building each solution from scratch, evolutionary algorithms use genetic operators such as mutation and crossover to breed new generations of candidate solutions from the best of the previous ones.
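To make the mutation-and-crossover loop concrete, here is a minimal Python sketch of an evolutionary algorithm on the toy "OneMax" problem (maximizing the number of 1-bits in a bit string); the problem and parameters are illustrative only.

```python
import random

BITS = 8  # each candidate solution is a bit string of this length

def fitness(bits):
    """Toy objective: count of 1-bits (the 'OneMax' problem)."""
    return sum(bits)

def crossover(a, b):
    """Single-point crossover combines two parent solutions."""
    point = random.randint(1, BITS - 1)
    return a[:point] + b[point:]

def mutate(bits, rate=0.1):
    """Flip each bit with a small probability."""
    return [1 - b if random.random() < rate else b for b in bits]

population = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(20)]
for _ in range(30):  # evolve for a fixed number of generations
    population.sort(key=fitness, reverse=True)
    parents = population[:10]  # keep the fittest half
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(10)]
    population = parents + children

print("Best solution found:", max(population, key=fitness))
```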

This research topic can be a choice of interest for someone who wants to learn more about algorithms and their vitality in engineering.

Evolutionary algorithms are transforming the way we approach engineering challenges by allowing us to explore enormous solution spaces and optimize complex systems.

The possibilities will keep growing as this technology develops further. Get ready to explore the fascinating world of evolutionary algorithms and their applications to engineering problems.

5. The Role of Big Data Analytics in the Industrial Internet of Things


Datasets can have answers to most of your questions. With good research and approach, analyzing this data can bring magical results. Welcome to the world of data-driven insights! Big Data Analytics is the transformative process of extracting valuable knowledge and patterns from vast and complex datasets, boosting innovation and informed decision-making.

This field allows you to transform the enormous amounts of data produced by IoT devices into insights that have the potential to change how large-scale industries work. It's like having a crystal ball that can foretell equipment failures and demand swings before they happen.

Big data analytics is being utilized to address some of the most critical issues, from supply chain optimization to predictive maintenance. Using it, you can find patterns, spot abnormalities, and make data-driven decisions that increase effectiveness and lower costs for several industrial operations by analyzing data from sensors and other IoT devices.
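As a tiny illustration of that kind of analysis, the sketch below (assuming the pandas library, with made-up sensor values) flags readings that deviate strongly from the median of a series; it is a crude stand-in for industrial anomaly detection, not a production method.

```python
import pandas as pd

# Hypothetical vibration readings from an industrial sensor.
readings = pd.Series([0.42, 0.45, 0.44, 0.43, 1.90, 0.46, 0.44, 0.47, 0.45, 2.10])

# Flag points that sit far from the median, using the median absolute deviation
# as a simple robust measure of "normal" spread.
median = readings.median()
mad = (readings - median).abs().median()
anomalies = readings[(readings - median).abs() > 5 * mad]

print(anomalies)  # the two spikes stand out from the ordinary readings
```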

The area is so vast that you'll need proper research to use and interpret all this information. Choose this as your computer research topic to discover big data analytics' most compelling applications and benefits. You will see that a significant portion of industrial IoT technology demands the study of interconnected systems, and there's nothing more suitable than extensive data analysis.

6. An Efficient Lightweight Integrated Blockchain (ELIB) Model for IoT Security and Privacy

Are you concerned about the security and privacy of your Internet of Things (IoT) devices? As more and more devices become connected, it is more important than ever to protect the security and privacy of data. If you are interested in cyber security and want to find new ways of strengthening it, this is the field for you.

ELIB is a cutting-edge solution that offers private and secure communication between IoT devices by fusing the strength of blockchain with lightweight cryptography. This architecture stores encrypted data on a distributed ledger so only parties with permission can access it.

But why is ELIB so practical and portable? ELIB uses lightweight cryptography to provide quick and effective communication between devices, unlike conventional blockchain models that need complicated and resource-intensive computations.
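The sketch below is not the ELIB model itself, but it illustrates the general idea of lightweight message authentication between a constrained device and a gateway using only Python's standard library; the shared key, device name, and payload are, of course, hypothetical.

```python
import hmac
import hashlib

# Pre-shared key between a constrained device and its gateway (hypothetical).
SHARED_KEY = b"demo-key-not-for-production"

def sign(message: bytes) -> str:
    """Compute a short authentication tag using HMAC-SHA256."""
    return hmac.new(SHARED_KEY, message, hashlib.sha256).hexdigest()

def verify(message: bytes, tag: str) -> bool:
    """Constant-time comparison avoids timing side channels."""
    return hmac.compare_digest(sign(message), tag)

reading = b'{"device": "sensor-7", "temp": 22.4}'
tag = sign(reading)
print(verify(reading, tag))         # True: message is authentic
print(verify(reading + b"x", tag))  # False: any tampering is detected
```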

Because of its growing importance, ELIB is gaining popularity as a research topic: people who understand how this framework works and how it helps reinforce data security are in high demand in finance and banking.

7. Natural Language Processing Techniques to Reveal Human-Computer Interaction for Development Research Topics

Welcome to the world where machines decode the beauty of the human language. With natural language processing (NLP) techniques, we can analyze the interactions between humans and computers to reveal valuable insights for development research topics. It is also one of the most crucial PhD topics in computer science as NLP-based applications are gaining more and more traction.

At its core, natural language processing (NLP) is a powerful set of techniques that enables us to examine and comprehend natural language data, such as conversations between people and machines. Insights on user behaviour, preferences, and pain points can be gleaned from these interactions using NLP approaches.
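As a minimal, standard-library-only sketch of mining such interactions, the snippet below counts frequent content words in a few made-up support messages to hint at recurring pain points; real NLP research would use far richer tooling than simple word counts.

```python
from collections import Counter
import re

# Hypothetical excerpts from human-computer interaction logs.
messages = [
    "The app crashes every time I upload a photo",
    "Upload keeps failing on slow connections",
    "Love the new design, but the upload button is hard to find",
]

stopwords = {"the", "a", "on", "is", "to", "i", "but", "every", "time"}
tokens = [w for m in messages
          for w in re.findall(r"[a-z']+", m.lower())
          if w not in stopwords]

# The most frequent content words point toward recurring pain areas.
print(Counter(tokens).most_common(3))
```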

But in which specific areas can NLP methods be applied most effectively? This is precisely what you will discover while doing this computer science research.

Gear up to learn more about the fascinating field of NLP and how it can change how we design and interact with technology, whether you are a UX designer, a data scientist, or just a curious tech lover and linguist.

8. All One Needs to Know About Fog Computing and Related Edge Computing Paradigms: A Complete Survey

If you are an IoT expert or a keen lover of the Internet of Things, take the leap and discover fog computing. With the rise of connected devices and the Internet of Things (IoT), traditional cloud computing models are no longer enough. That's where fog computing and related edge computing paradigms come in.

Fog computing is a distributed approach that brings processing and data storage closer to the devices that generate and consume data by extending cloud computing to the network's edge.

As computing technologies are significantly used today, the area has become a hub for researchers to delve deeper into the underlying concepts and devise more and more fog computing frameworks. You can also contribute to and master this architecture by opting for this stand-out topic for your research.

9. Artificial Intelligence (AI)

The field of artificial intelligence studies how to build machines with human-like cognitive abilities, and it is one of the trending research topics in computer science. Unlike humans, AI technology can handle massive amounts of data in many ways. Some important areas of AI where more research is needed include:

  • Deep learning: Within the field of machine learning, deep learning uses layered neural networks, loosely inspired by the workings of the human brain, to process inputs and make judgements based on them.
  • Reinforcement learning: A machine learns by trial and error, receiving rewards or penalties for its actions, in a manner loosely akin to human learning (a minimal sketch follows this list).
  • Natural language processing (NLP): Just as humans communicate in language, machines can now interpret and analyse human language, both written and spoken; this is referred to as natural language processing.
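Here is the promised sketch: a minimal tabular Q-learning agent on a made-up five-state corridor, learning from rewards by trial and error. The environment, rewards, and hyperparameters are illustrative only.

```python
import random

# Toy corridor: states 0..4; the agent starts at 0 and is rewarded at state 4.
N_STATES, GOAL = 5, 4
ACTIONS = [-1, +1]  # move left or right
q_table = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, epsilon = 0.5, 0.9, 0.2

for _ in range(200):  # training episodes
    state = 0
    while state != GOAL:
        # Epsilon-greedy choice: mostly exploit, occasionally explore.
        if random.random() < epsilon:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: q_table[(state, a)])
        next_state = min(max(state + action, 0), N_STATES - 1)
        reward = 1.0 if next_state == GOAL else 0.0
        # Standard Q-learning update rule.
        best_next = max(q_table[(next_state, a)] for a in ACTIONS)
        q_table[(state, action)] += alpha * (reward + gamma * best_next - q_table[(state, action)])
        state = next_state

# After training, the greedy policy should point right (+1) in every state.
print({s: max(ACTIONS, key=lambda a: q_table[(s, a)]) for s in range(GOAL)})
```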

10. Digital Image Processing

Digital image processing is the use of computer algorithms to process digital images, and many recent research topics in computer science are grounded in these techniques. As a subset of digital signal processing, it has numerous advantages over analogue image processing: many different algorithms can be applied to the input data, and issues such as noise accumulation and signal distortion during processing can be avoided. The field offers a variety of directions for research; the most recent thesis and research topics in digital image processing are listed below, followed by a short enhancement example:

  • Image Acquisition  
  • Image Enhancement  
  • Image Restoration  
  • Color Image Processing  
  • Wavelets and Multi Resolution Processing  
  • Compression  
  • Morphological Processing  
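As a small taste of the image enhancement topic, the following NumPy sketch applies global histogram equalization to a synthetic low-contrast image; it is a textbook illustration, not a specific research method, and the synthetic input stands in for a real photograph.

```python
# Global histogram equalization with NumPy: stretch a narrow range of gray
# levels across the full 0-255 range to improve contrast.
import numpy as np

rng = np.random.default_rng(0)
img = rng.integers(100, 140, size=(64, 64), dtype=np.uint8)  # low-contrast image

hist, _ = np.histogram(img, bins=256, range=(0, 256))
cdf = hist.cumsum()
cdf_min = cdf[cdf > 0].min()
# Map each gray level through the normalized cumulative distribution.
lut = np.clip(np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255), 0, 255).astype(np.uint8)
equalized = lut[img]

print("before:", img.min(), "-", img.max())               # narrow range, e.g. 100-139
print("after :", equalized.min(), "-", equalized.max())   # stretched toward 0-255
```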

11. Data Mining

Data mining is the process of extracting valuable information from raw data. Using various tools and techniques, it supports tasks such as association rule mining, prediction analysis, and clustering. Clustering is one of the most effective ways to extract useful information from unprocessed data: by grouping similar items together and separating dissimilar ones, it lets analysts study the structure of a dataset. Data mining offers a wide range of trending computer science research topics for undergraduates (a small clustering example follows the list):

  • Data Spectroscopic Clustering  
  • Asymmetric spectral clustering  
  • Model-based Text Clustering  
  • Parallel Spectral Clustering in Distributed System  
  • Self-Tuning Spectral Clustering  
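To show the general workflow behind these clustering topics, here is a minimal example using scikit-learn's KMeans on synthetic 2-D data (assuming scikit-learn is installed); the spectral clustering variants listed above follow the same fit-and-label pattern.

```python
# A minimal clustering example: group synthetic 2-D points into two clusters.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Two synthetic groups of points with different centers.
data = np.vstack([rng.normal(loc=(0, 0), scale=0.5, size=(50, 2)),
                  rng.normal(loc=(4, 4), scale=0.5, size=(50, 2))])

model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(data)
labels = model.labels_

print("cluster sizes:", np.bincount(labels))   # roughly [50, 50]
print("centers:\n", model.cluster_centers_)
```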

12. Robotics

Robotics is the research, design, and construction of a wide range of robot systems, and it explores how robots interact with their environments, surrounding objects, other robots, and the humans they assist. The field draws on numerous academic disciplines, including mathematics, physics, biology, and computer science; artificial intelligence (AI), physics simulation, and advanced sensor processing (such as computer vision) are some of the key contributions from computer science. M.Sc. computer science project topics around robotics focus on the areas below:

  • Human Robot collaboration  
  • Swarm Robotics  
  • Robot learning and adaptation  
  • Soft Robotics  
  • Ethical considerations in Robotics  

How to Choose the Right Computer Science Research Topics?  

Choosing among the many research areas in computer science can be overwhelming. The following tips can help in your pursuit:

  • Chase Your Curiosity:  Think about what in the tech world keeps you up at night, in a good way. If it makes you go "hmm," that's the stuff to dive into.  
  • Tech Trouble Hunt: Hunt for the tech troubles that bug you. You know, those things that make you mutter, "There's gotta be a better way!" That's your golden research nugget.  
  • Interact with Nerds: Grab a coffee (or your beverage of choice) and have a laid-back chat with the tech geeks around you. They might spill the beans on cool problems or untapped areas in computer science.  
  • Resource Reality Check: Before diving in, do a quick reality check. Make sure your chosen topic isn't a resource-hungry beast. You want something you can tackle without summoning a tech army.  
  • Tech Time Travel: Imagine you have a time machine. What future tech would blow your mind? Research that takes you on a journey to the future is like a time travel adventure.  
  • Dream Big, Start Small:  Your topic doesn't have to change the world on day one. Dream big, but start small. The best research often grows from tiny, curious seeds.  
  • Be the Tech Rebel: Don't be afraid to be a bit rebellious. If everyone's zigging, you might want to zag. The most exciting discoveries often happen off the beaten path.  
  • Make it Fun: Lastly, make sure it's fun. If you're going to spend time on it, might as well enjoy the ride. Fun research is the best research.  

Tips and Tricks to Write Computer Science Research Topics

Before you start exploring these hot research topics in computer science, keep a few tips and tricks in mind:

  • Know your interest.
  • Choose the topic wisely.
  • Make proper research about the demand of the topic.
  • Get proper references.
  • Discuss with experts.

By following these tips and tricks, you can write a compelling and impactful computer research topic that contributes to the field's advancement and addresses important research gaps.

Why is Research in Computer Science Important?

Computers and technology are becoming an integral part of our lives, and we depend on them for much of our work. As lifestyles and needs change, continuous research in this sector is required to ease human work. Contributing meaningfully also requires solid technical grounding: an Advanced Computer Programming certification, for example, lets you deepen your knowledge of a versatile language and gain hands-on experience with the topics of C# application development.

1. Innovation in Technology

Research in computer science contributes to technological advancement and innovations. We end up discovering new things and introducing them to the world. Through research, scientists and engineers can create new hardware, software, and algorithms that improve the functionality, performance, and usability of computers and other digital devices.

2. Problem-Solving Capabilities

From disease outbreaks to climate change, solving complex problems requires advanced computer models and algorithms. Computer science research enables scholars to create methods and tools that help resolve these challenging issues far faster than would otherwise be possible.

3. Enhancing Human Life

Computer science research has the potential to significantly enhance human life in a variety of ways. For instance, researchers can produce educational software that enhances student learning, or new healthcare technology that improves clinical outcomes. If you wish to pursue a Ph.D., these can become interesting computer science research topics as well.

4. Security Assurance

As more sensitive data is being transmitted and kept online, security is our main concern. Computer science research is crucial for creating new security systems and tactics that defend against online threats.

From machine learning and artificial intelligence to blockchain, edge computing, and big data analytics, numerous trending computer research topics exist to explore. One of the most important trends is using cutting-edge technology to address current issues. For instance, new IoT security and privacy opportunities are emerging by integrating blockchain and edge computing. Similarly, the application of natural language processing methods is assisting in revealing human-computer interaction and guiding the creation of new technologies.

Another trend is the growing emphasis on sustainability and ethical considerations in technological development, with researchers looking into how computer science can support responsible innovation.

With the latest developments and leveraging cutting-edge tools and techniques, researchers can make meaningful contributions to the field and help shape the future of technology. Going for Full-stack Developer online training will help you master the latest tools and technologies. 

Frequently Asked Questions (FAQs)

Research in computer science is focused on many different niches and can be theoretical or applied; it depends entirely on the candidate and their area of focus. Researchers may, for example, work on inventing new algorithms or on advancing the state of the art in a particular subfield.

Yes, and it can be a very good opportunity for the candidate, since computer science students often already have some background knowledge of the topic. They may also find accessible thesis topics for computer science to support their research through KnowledgeHut.

Computer science offers broad scope: a candidate can choose from subjects such as AI, database management, software design, graphics, and many more.


Areas of Research

On this page:

  • Assistive Technologies and Learning with Disabilities
  • Biomedical Informatics
  • Biomedical Imaging and Visualization
  • Cloud Computing
  • Cybersecurity
  • Cyber-Physical Systems
  • Databases and Data Mining
  • Data Science and Analytics
  • Multimedia Systems and Apps
  • Semantic, Social and Sensor Web
  • Machine Learning and Artificial Intelligence
  • Wireless Networking and Security

Assistive Technologies and Learning with Disabilities

"Disabilities can be very traumatic, leading to frustration and depression," according to the American Foundation for the Blind. The rate of unemployment among legally blind individuals of working age residing in the United States greatly exceeds the unemployment rate for individuals with no functional limitations. Clever devices and information technology engineering strategies can be developed to help people overcome barriers to pursue educational and professional opportunities that will allow them to become productive members of the society.

Current Research Projects

  • Reading devices for the blind and visually impaired
  • Navigation devices for the blind
  • Multimodal forms of representation for virtual learning environments
  • Rehabilitation Assistants

Researchers

  • Nikolaos Bourbakis

Research Labs

  • Center of Assistive Research Technologies (CART)

Bioinformatics advances fundamental concepts in molecular biology, biochemistry, and computer science to further our understanding of DNA, genes, and protein structures as they relate to mechanisms for drug development and the treatment of diseases.

  • Metabolomics and toxicology
  • Trends in molecular evolution
  • Automation of forensic DNA analysis
  • Indexing genomic databases
  • Stochastic reaction modeling
  • Search optimization
  • National model for bioinformatics education
  • Disease analysis
  • Travis Doom
  • Guozhu Dong
  • Michael Raymer
  • Tanvi Banerjee
  • T.K. Prasad
  • Bioinformatic Research Group

Biomedical imaging and visualization research has become a very active research field during the last two decades, offering unique solutions for a great variety of biological and biomedical problems. Analysis and visualization of medical images facilitates diagnosis and treatment planning. Visualization systems used as surgical navigation systems enable precise and minimally invasive surgery.

  • Image registration in surgical navigation
  • Segmentation of MR and CT images for spinal surgery
  • Design of a surgical robot assistance for biopsy
  • Detection and visualization of brain shift during brain surgery
  • Automated endoscopic imaging
  • EEG+fMRI Modeling of the Brain
  • Ultrasound Modeling of Human organs (heart, liver)
  • Bio-signatures of in-vivo cells
  • Thomas Wischgoll
  • Advanced Visual Data Analysis (AViDA)

Cloud computing is a major step toward organizing all aspects of computation as a public utility service. It embraces concepts such as software as a service and platform as a service, including services for workflow facilities, application design and development, deployment and hosting services, data integration, and management of software. The cloud platform increases in importance as our industry makes the phase change from in-house data management to cloud-hosted data management to improve efficiency and focus on core businesses. However, like any new technology, there are formidable problems, from performance issues to security and privacy, from metadata management to massively parallel execution.

This is a major part of the Kno.e.sis Research Center.

  • Cloud infrastructure for data management
  • Privacy and security in cloud data management
  • Cloud-based mining and learning algorithms
  • Cloud support for text mining and web search
  • Large-scale natural language modeling and translation
  • Parallel and distributed algorithms for bioinformatics
  • Performance evaluation and benchmarking
  • Database Research Laboratory
  • Bioinformatics Research Group

The Department of Computer Science and Engineering of Wright State University recently received a grant, titled "REU Site: Cybersecurity Research at Wright State University", from the National Science Foundation. This NSF REU site offers a ten-week summer program that aims to provide a diverse group of motivated undergraduates with competitive research experience in cybersecurity. A variety of projects will be offered in network security, intrusion detection, wireless sensor network security, Internet malware detection, analysis, and mitigation, software reverse engineering and vulnerability discovery, and privacy-preserving data mining. More information about this REU site can be found at http://reu.cs.wright.edu.

In addition, there are ongoing projects sponsored by DARPA and ONR on deepfake techniques, deep understanding of technical documents, and computer security (for example, memory attacks).

  • Junjie Zhang
  • WSU Cybersecurity Lab

Related Programs

  • Master of Science in Cybersecurity
  • Undergraduate

Cyber-physical systems are jointly physical and computational and are characterized by complex loops of cause and effect between the computational and physical components. We focus on methods by which such systems can self-adapt to repair damage and exploit opportunities, and methods by which we can explain and understand how they operate even after they have diverged from their original forms. Our current application area is the creation of control systems for insect-like flapping-wing air vehicles that repair themselves, in flight, after suffering wing damage.


Data mining is the process of extracting useful knowledge from a database. Data mining facilitates the characterization, classification, clustering, and searching of different databases, including text data, image and video data, and bioinformatics data for various applications. Text, multimedia, and bioinformatics databases are very large and so parallel/distributed data mining is essential for scalable performance.

  • Parallel/distributed data mining
  • Text/image clustering and categorization
  • Metadata for timelining events
  • XML database
  • Data warehousing
  • Biological/medical data mining
  • Data Mining Research Lab

Data Science and Analytics

Mathematical, statistical, and graphical methods for exploring large and complex data sets.  Methods include statistical pattern recognition, multivariate data analysis, classifiers, modeling and simulation, and scientific visualization.

  • Topological Data Analysis
  • Predictive Analytics
  • Michelle Cheatham
  • Machine Learning and Complex Systems Lab
  • Data Science for Healthcare

Multimedia systems offer synergistic and integrated solutions to a great variety of applications related to multi-modality data, such as automatic target recognition, surveillance, tracking human behavior, etc.

  • Object recognition in digital images and video
  • Multimedia content classification and indexing
  • Integrated search and retrieval in multimedia repositories
  • Background elimination in live video
  • Modeling and visualization
  • Biometrics and cyber security
  • Network and security visualization

Semantic, Social and Sensor Webs

The World Wide Web contains a rapidly growing amount of enterprise, social, and device/sensor/IoT/WoT data in unstructured, semistructured, and structured forms. The Semantic Web initiative by the World Wide Web Consortium (W3C), of which Wright State University is a member (represented by Kno.e.sis), has developed standards and technologies to associate meaning with data, to make data more machine- and human-understandable, and to apply reasoning techniques for intelligent processing leading to actionable information, insights, and discovery. Kno.e.sis has one of the largest academic groups in the US working on the Semantic Web and its applications for better use and analysis of social and sensor data.

  • Computer assisted document interpretation tools
  • Information extraction from semi-structured documents
  • Semantic Web knowledge representation
  • Semantic sensor web
  • Linked and Big Data

Machine Learning and Artificial Intelligence

Machine learning and artificial intelligence aim to develop computer systems that exhibit intelligent behavior in decision making, object recognition, planning, learning, and other applications that require intelligent assessment of complex information.  Our faculty apply modern tools such as deep neural networks, evolutionary algorithms, statistical inference, topological analysis, and graphical inference models to a wide variety of problems from engineering, science, and medicine.

  • Knowledge Representation and Reasoning
  • Intelligent agents
  • Natural language understanding
  • Evolutionary algorithms and evolvable hardware
  • Autonomous robotic systems
  • Machine learning
  • Fuzzy and neural systems
  • Intelligent control systems
  • Deep Neural Networks

Wireless communication and networking have revolutionized the way people communicate. Currently, there are more than two billion cellular telephone subscribers worldwide. Wireless local area networks have become a necessity in many parts of the globe. With new wireless enabled applications being proposed every day, such as wireless sensor networks, telemedicine, music telepresence, and intelligent web, the potential of this discipline is just being unleashed.

  • Ultra-high speed optical network
  • Wireless sensor network
  • Music telepresence
  • Cognitive radio and dynamic spectrum access
  • Secure protocol and secure processors authentication
  • Cyber-physical systems
  • Network coding


Strategic Research Areas


Research in Electrical and Computer Engineering covers an extremely broad range of topics. Whether in computer architecture, energy and power systems or in nanotechnology devices, the research conducted in ECE is at the cutting edge of technological and scientific developments. 


  • Computer Engineering

Computer engineering concerns itself with the understanding and design of hardware needed to carry out computation, as well as the hardware-software interface. It is sometimes said that computer engineering is the nexus that connects electrical engineering and computer science. Research and teaching areas with a significant computer engineering component include digital logic and VLSI design, computer architecture and organization, embedded systems and the Internet of Things, virtualization and operating systems, code generation and optimization, computer networks and data centers, electronic design automation, and robotics.

Related Research Areas

  • Artificial Intelligence
  • Complex Systems, Network Science and Computation
  • Computer Architecture
  • Computer Systems
  • Data Mining
  • Energy and the Environment
  • Rapid Prototyping
  • Robotics and Autonomy
  • Scientific Computing
  • Sensors and Actuators
  • Signal and Image Processing
  • Statistics and Machine Learning

Robotics and Autonomy

Robotics at Cornell spans various subareas, including perception, control, learning, planning, and human-robot interaction. We work with a variety of robots such as aerial robots, home and office assistant robots, autonomous cars, humanoids, evolutionary robots, legged robots, snake robots, and more. The Collective Embodied Intelligence Lab works on the design and coordination of large robot collectives able to achieve complex behaviors beyond the reach of single-robot systems, together with studies of how social insects achieve this in nature. Major research topics include swarm intelligence, embodied intelligence, autonomous construction, bio-cyber-physical systems, human-swarm interaction, and soft robots.

Visit the Cornell Engineering Robotics Website for more.

  • Integrated Circuits
  • Power Electronics
  • Robotics and Autonomy
  • Systems and Networking


  • Information, Networks, and Decision Systems

This research area focuses on the advancement of research and education in the information, learning, network, and decision sciences. Our research is at the frontier of a wide range of fields and applications, including machine learning and signal processing, optimization and control theory, information theory and coding, power systems and electricity markets, network science, and game theory. The work encompasses theory and practice, with the overarching objective of developing the mathematical underpinnings and tools needed to address some of the most pressing challenges facing society today in energy and climate change, transportation, social networks, and human health. In particular, the Foundations of Information, Networks, and Decision Systems (FIND) group comprises a vibrant community of faculty, postdocs, and students dedicated to developing the mathematical underpinnings and tools needed to address the aforementioned challenges in a principled and theory-guided manner.

  • Biotechnology
  • Computational Science and Engineering
  • Energy Systems
  • Image Analysis
  • Information Theory and Communications
  • Optimization
  • Remote Sensing


  • Physical Electronics, Devices, and Plasma Science

Work in this area applies the physics of electromagnetism, quantum mechanics, and the solid state to implement devices and systems for applications including energy, quantum technologies, sensing, communication, and computation. Our efforts span theory and development of new electronic and optical devices and materials, micro-electromechanical systems, acoustic and optical sensing and imaging, quantum control of individual atoms near absolute zero temperature, and experiments on high-energy plasmas at temperatures close to those at the center of the sun.

At Cornell ECE, we work on diverse topics aimed at transforming the way we view the world. Our interdisciplinary research reveals fundamental similarities across problems and prompts new research into some of the most exciting and cutting-edge developments in the field.

  • Advanced Materials Processing
  • Astrophysics, Fusion and Plasma Physics
  • High Energy Density, Plasma Physics and Electromagnetics
  • Materials Synthesis and Processing
  • Microfluidics and Microsystems
  • Nanotechnology
  • Photonics and Optoelectronics
  • Semiconductor Physics and Devices
  • Solid State, Electronics, Optoelectronics and MEMs


  • Circuits and Electronic Systems

Integrated circuits are ubiquitous and integral to everyday devices, from cellular phones and home appliances to automobiles and satellites. Healthcare, communications, consumer electronics, high-performance scientific computing, and many other fields are creating tremendous new opportunities for innovation in circuits and electronic systems at every level. Research in this area spans topics including analog and mixed signal circuits, RF transceivers, low power interfaces, power electronics and wireless power transfer, and many others. 

  • Micro Nano Systems
  • Optical Physics and Quantum Information Science


  • Bio-Electrical Engineering

Biological and Biomedical Electrical Engineering (B2E2) consists of both applied and fundamental work to understand the complexity of biological systems at different scales, e.g., from a single neuronal or cancer cell all the way to the brain or a malignant tumor. B2E2 aims to develop new hardware and computational tools to identify, characterize, and treat diseases. In the physical domain, electrical engineering approaches to integrated microsystems lead to new biological and medical sensors, built on state-of-the-art ultrasonic, RF, optical, MRI, CT, and electrical impedance transducers.

The integration of sensors and electronics is used to develop implantable and wearable devices with decreasing size, weight, and power and increasing functionality. B2E2 microsystems can help create interfaces for sensing and actuation that improve our understanding of the physiological and pathological mechanisms of disease and enable advanced robotic interfaces in medicine. Medical devices can generate vast amounts of data, which require both real-time and post-acquisition processing. B2E2 faculty, sometimes in collaboration with medical researchers, develop advanced computational tools to learn from and exploit these data and apply artificial intelligence approaches to improve medical practice: earlier disease detection, better diagnosis, assessment of response to therapy, and guided surgical procedures.

  • Biomedical Imaging and Instrumentation
  • Complex Systems, Network Science and Technology
  • Computer-Aided Diagnosis
  • Nanobio Applications
  • Neuroscience


Hardware That Protects Against Software Attacks

ECE's Ed Suh and Zhiru Zhang and CS's Andrew C. Myers aim to develop both hardware architecture and design tools to provide comprehensive and provable security assurance for future computing systems against software-level attacks that exploit seven common vulnerability classes.


Re-architecting Next-Gen Computing Systems

Disaggregated architectures have the potential to increase resource capacity by 10 to 100 times compared with server-centric architectures.


Re-imagining Computer System Memories

Interdisciplinary team will provide new insights and an entirely new paradigm for the semiconductor industry in the emerging era of big data.

The Martinez and Zhang Research Groups

Engineers to hack 50-year-old computing problem with new center

Cornell engineers are part of a national effort to reinvent computing by developing new solutions to the “von Neumann bottleneck,” a feature-turned-problem that is almost as old as the modern computer itself.


The Laboratory of Plasma Studies: Uncovering mysteries of high energy density plasma physics

In the basement of Grumman Hall, an x-ray pulse produced by a hot, dense plasma – an ionized gas – lasting only fractions of a microsecond both begins and ends an experiment. Hidden within that fraction of time lies a piece of a puzzle—data that graduate students and staff scientists at the Laboratory of Plasma Studies (LPS) will use to better understand the mysterious physics behind inertial confinement fusion.


Sophia Rocco: Hoping to make the world a better place through a potential renewable energy source

When she was looking at graduate schools, physics major Sophia Rocco thought she would be in a materials science program bridging her interests in electricity and magnetism and novel materials for solar cells. Chancing upon the School of Electrical and Computer Engineering at Cornell, she discovered the Laboratory of Plasma Studies (LPS).


Finding the Ultimate Energy Source: Cornell’s Lab of Plasma Studies

Plasma is one of the four fundamental states of matter, but it does not exist freely on the Earth’s surface. It must be artificially generated by heating or subjecting a neutral gas to a strong electromagnetic field. Located in the basement of Grumman Hall are two large pulse-power generators that create plasma by delivering extremely high currents to ordinary matter for short periods. These generators are part of the  Lab of Plasma Studies  at Cornell University.


Vertical gallium oxide transistor high in power, efficiency

The research group led by Grace Xing and Debdeep Jena presented research on a new gallium oxide field-effect transistor at a conference at the Massachusetts Institute of Technology May 29-June 1.


Molnar, Jena and Xing join national consortium to develop future cellular infrastructure

Three Cornell faculty will be part of the newly established $27.5 million ComSenTer, a center for converged terahertz communications and sensing.


Data on the Brain

The NSF has found a willing partner at Cornell University in this quest to create technologies that will allow researchers to image the brain and the nervous system.


Computer Engineering

Computer engineering research papers/topics

Setswana Grammar Checker for Declarative Sentences Using LSTM-Recurrent Neural Network

Abstract: This research is aimed at developing a Setswana grammar checker for Setswana declarative sentences using Long Short-Term Memory Recurrent neural networks (LSTM-RNNs). The research was motivated by the fact that Setswana is recognized as one of the under-resourced languages in the world and the language lacks Natural language processing (NLP) tools such as grammar checkers; this delays the language’s technological progress. A Setswana grammar checker is a pre-requisite to th...

Case-based reasoning system for prediction of fuel consumption by haulage trucks in open–pit mines

Abstract: The shovel-truck system is commonly used in open-pit mining operations. Truck haulage cost constitutes about 26% of open-pit mining costs as the trucks are mostly powered by diesel whose cost is escalating annually. Therefore, reducing fuel consumption could lead to a significant decrease in overall mining costs. Various methods have been proposed to improve fuel efficiency in open-pit mines. Case-based reasoning (CBR) can be used to estimate fuel consumption by haulage trucks. In ...

Meteorological influence on eLoran accuracy

Abstract: Stringent accuracy requirements need to be met for eLoran deployment in marine navigation and harbour entrance and approach. A good accuracy model is therefore required to predict the positioning accuracy at the user’s receiver locations. Accuracy depends on the variations of additional secondary factors (ASFs) and the primary factor delay. The changes in the air refractive index caused variations in the primary factor (PF) delay of the eLoran signal, and current eLoran accuracy ...

A Model for Providing List of Reliable Providers for Grid Computing

Abstract: Grid computing is an interconnected computer system, where machines share resources that are highly heterogeneous. Reliability is the probability that a process will successfully perform its prescribed task without any failure at a given point of time. Hence, ensuring reliable transactions plays a vital role in grid computing. The main objective of the paper is to develop a reliable and robust two way trust model for the Grid system. Thus the goals of this proposed trust model are ...

Query Processing with Respect to Location in Wireless Broadcasting

Abstract: The wireless communication involves a client server communication i.e. the client needs to send a request for performing a process; it can perform only after the response of the server. Large number of request will result the load balance in the server, which cause process delay. It has been resolved by using wireless broadcast client server communication. To communicate with server the client use fee based cellular type network to achieve a responsible operating range. For avoidin...

A preliminary application of a machine learning model for the prediction of the load variation in three-point bending tests based on acoustic emission signals

Abstract: The load variation during three-point bending (TPB) tests on prismatic Nestos (Greece) marble specimens instrumented by piezoelectric sensors is predicted using acoustic emission (AE) signals. The slope of the cumulative amplitude vs the predicted load curve is potentially useful for determining the forthcoming specimen failure as well as the indirect tensile strength of the material. The optimum artificial neural networks (ANN) model was selected based on a comparison of different...

Enhancing Personalization and Sales Conversion in E-Commerce: The Role of AI in the App Shop Experience

A shopping application can be designed and built easily using the Flutter framework. "Shop AI" offers customers an efficient shopping experience in terms of convenience and functionality. This report focuses on the design and development framework: it explores the application architecture and shows how Flutter's versatility helps developers create cross-platform, responsive, and visually appealing user interfaces. Discover the main features of "Shop AI" by searchi...

Algorithms and Data Structures Part 4: Searching and Sorting (Wikipedia Book 2014)

Contents: Searching (search algorithms, linear search, binary search); Sorting (sorting algorithms, bubble sort, quicksort, merge sort, insertion sort, heapsort); references and article sources.
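As a quick illustration of one entry in those contents, here is an iterative binary search in Python; it is a standard textbook version, not code taken from the book above.

```python
# Iterative binary search over a sorted list: returns the index of the
# target, or -1 if it is not present. Runs in O(log n) comparisons.
def binary_search(items: list[int], target: int) -> int:
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1          # discard the left half
        else:
            hi = mid - 1          # discard the right half
    return -1

sorted_data = [2, 5, 8, 12, 16, 23, 38, 56, 72, 91]
print(binary_search(sorted_data, 23))   # 5
print(binary_search(sorted_data, 7))    # -1
```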

E-Store Project Software Requirements Specification Version <4.0>

Introduction: The introduction of the Software Requirements Specification (SRS) provides an overview of the entire SRS, including its purpose, scope, definitions, acronyms, abbreviations, references, and an overview of the document. The aim of this document is to gather, analyze, and give an in-depth insight into the complete Marvel Electronics and Home Entertainment software system by defining the problem statement in detail. It also concentrates on the capabilities required by stakeholders and...

Skin Cancer Diagnostics with a Smartphone App

Melanoma is one of the deadliest types of skin cancer and can be difficult to treat when it's advanced. To reduce mortality rates, early detection is key. In order to do this, computer-aided systems have been developed to help dermatologists diagnose the condition. To make it more accessible to the public, researchers are working on creating portable, at-home diagnostic systems. An Android-based smartphone application utilizing image capture, preprocessing, and segmentation was developed to e...

Design Control and Monitoring System Of Water Distribution Networks

ABSTRACT Water distribution network automation is becoming more popular day by day because of its numerous advantages. An Internet-based water distribution network system focuses on monitoring and controlling the network. Water distribution networks suffer from water wastage due to improper water supply management and the lack of real-time monitoring, which contributes to water scarcity, a major problem in cities. Water distribution network auto...

2D Radar Antenna Orientation and Control

ABSTRACT Antenna orientation is the first and one of the most important stages of how a RADAR system works, with a greater effect on performance than the other stages. RADAR is used in many applications and systems, most of which need high precision that depends on the accuracy of the antenna's orientation. Much research has been done and different systems have been designed to obtain results within a permissible range. This work designs a RADAR antenna orientation and control system using an FPGA and stepper m...

Design and Optimization of W-Tailored Optical Fiber

ABSTRACT  This research work studies the temperature effects in W-shaped core refractive index optical fiber. The work designs an optimized W-tailored optical fiber (OWTOF) that checks the effect of rising temperatures in W-tailored optical fiber (WTOF) for better communication. Initially, an introduction to the general concept of optical fibers including tailored optical fibers (TOF), temperature effects and models is presented as a background. This is followed by studies on existing litera...

Design of Digital Image Enhancement System Using Noise Filtering Techniques

ABSTRACT  Noise Reduction in digital image is one of the most important and difficult techniques in image research. The aim of Noise Reduction in digital image is to improve the visual appearance of an image, or to provide a better transform representation for another automated image processing. Many images like medical images, satellite images, aerial images and even real-life photographs suffer from viewing, removing blurring and noise, increasing contrast, etc. Reducing noise from the dig...

Design And Implementation Of An Automated Inventory Management System Case Study: Smart Shoppers Masaka

ABSTRACT The general purpose was to develop an efficient Inventory Management System (IMS) that improves service delivery at Smart shoppers’ Masaka. The main objectives were to collect and analyze user requirements that provide the researchers with enough information of what the system users want the system to accomplish, to design an Automated Inventory Management System, to implement a prototype and to test and validate the designed prototype. The methodology used includes Interviews, Que...

Projects, theses, seminars, research papers, and term paper topics and materials in Computer Engineering.

Popular Papers/Topics

  • Building and Assembling a Computer System
  • Digital Combination Lock System
  • Design and Construction of Four-Way Traffic Light
  • A Technical Report on Student Industrial Work Experience Scheme (SIWES) on Laptop Repair
  • Microcontroller-Based Digital Code Lock
  • Design of a Patient Heartbeat and Temperature Monitor Using RF
  • Automobile Battery Preservative
  • Network Interconnection Devices
  • Smart Card Technology
  • Design and Implementation of a Software Intercom on LAN
  • Design & Construction of an 8-Channel Buzzer Using a Microcontroller
  • Appliances Control Through SMS
  • Design and Simulation of a Secured Wireless Network (A Case Study of Houdegbe North American University, Benin)
  • Microcontroller-Based Automation System
  • Design and Implementation of an Internet of Things (IoT) Based Smart Waste Bin for Fill Level Monitoring and Biodegradability Detection



Modular, scalable hardware architecture for a quantum computer


Quantum computers hold the promise of being able to quickly solve extremely complex problems that might take the world’s most powerful supercomputer decades to crack.

But achieving that performance involves building a system with millions of interconnected building blocks called qubits. Making and controlling so many qubits in a hardware architecture is an enormous challenge that scientists around the world are striving to meet.

Toward this goal, researchers at MIT and MITRE have demonstrated a scalable, modular hardware platform that integrates thousands of interconnected qubits onto a customized integrated circuit. This “quantum-system-on-chip” (QSoC) architecture enables the researchers to precisely tune and control a dense array of qubits. Multiple chips could be connected using optical networking to create a large-scale quantum communication network.

By tuning qubits across 11 frequency channels, this QSoC architecture allows for a new proposed protocol of “entanglement multiplexing” for large-scale quantum computing.

The team spent years perfecting an intricate process for manufacturing two-dimensional arrays of atom-sized qubit microchiplets and transferring thousands of them onto a carefully prepared complementary metal-oxide semiconductor (CMOS) chip. This transfer can be performed in a single step.

“We will need a large number of qubits, and great control over them, to really leverage the power of a quantum system and make it useful. We are proposing a brand new architecture and a fabrication technology that can support the scalability requirements of a hardware system for a quantum computer,” says Linsen Li, an electrical engineering and computer science (EECS) graduate student and lead author of a paper on this architecture.

Li’s co-authors include Ruonan Han, an associate professor in EECS, leader of the Terahertz Integrated Electronics Group, and member of the Research Laboratory of Electronics (RLE); senior author Dirk Englund, professor of EECS, principal investigator of the Quantum Photonics and Artificial Intelligence Group and of RLE; as well as others at MIT, Cornell University, Delft University of Technology, the U.S. Army Research Laboratory, and the MITRE Corporation. The paper appears today in Nature.

Diamond microchiplets

While there are many types of qubits, the researchers chose to use diamond color centers because of their scalability advantages. They previously used such qubits to produce integrated quantum chips with photonic circuitry.

Qubits made from diamond color centers are “artificial atoms” that carry quantum information. Because diamond color centers are solid-state systems, the qubit manufacturing is compatible with modern semiconductor fabrication processes. They are also compact and have relatively long coherence times, which refers to the amount of time a qubit’s state remains stable, due to the clean environment provided by the diamond material.

In addition, diamond color centers have photonic interfaces, which allow them to be remotely entangled, or connected, with other qubits that are not adjacent to them.

“The conventional assumption in the field is that the inhomogeneity of the diamond color center is a drawback compared to identical quantum memory like ions and neutral atoms. However, we turn this challenge into an advantage by embracing the diversity of the artificial atoms: Each atom has its own spectral frequency. This allows us to communicate with individual atoms by voltage tuning them into resonance with a laser, much like tuning the dial on a tiny radio,” says Englund.

This is especially difficult because the researchers must achieve it at a large scale to compensate for the qubit inhomogeneity across a large system.

To communicate across qubits, multiple such "quantum radios" need to be dialed into the same channel, a condition that becomes vanishingly unlikely to occur by chance when scaling to thousands of qubits. The researchers surmounted that challenge by integrating a large array of diamond color center qubits onto a CMOS chip that provides the control dials. The chip can be incorporated with built-in digital logic that rapidly and automatically reconfigures the voltages, enabling the qubits to reach full connectivity.

“This compensates for the inhomogeneous nature of the system. With the CMOS platform, we can quickly and dynamically tune all the qubit frequencies,” Li explains.
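To illustrate the tuning idea numerically, here is a toy Python sketch, emphatically not the authors' control system: each simulated emitter gets a random spectral offset, and an assumed linear voltage-to-frequency coefficient is used to compute the voltage that pulls it onto a common channel. All numbers are made up for the illustration.

```python
# Toy numerical sketch of voltage-tuning many inhomogeneous emitters onto
# one shared frequency channel (illustrative only; not the MIT/MITRE scheme).
import numpy as np

rng = np.random.default_rng(7)
n_qubits = 1024
target_ghz = 0.0                                      # offset of the chosen channel
base_detuning_ghz = rng.normal(0.0, 2.0, n_qubits)    # assumed inhomogeneous spread
shift_per_volt_ghz = 0.5                              # assumed linear tuning coefficient

# Voltage needed to pull each emitter onto the target channel.
voltages = (target_ghz - base_detuning_ghz) / shift_per_volt_ghz
tuned = base_detuning_ghz + shift_per_volt_ghz * voltages

print("max |voltage| applied:", round(float(np.max(np.abs(voltages))), 2), "(toy units)")
print("max residual detuning:", float(np.max(np.abs(tuned - target_ghz))), "GHz")
```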

Lock-and-release fabrication

To build this QSoC, the researchers developed a fabrication process to transfer diamond color center “microchiplets” onto a CMOS backplane at a large scale.

They started by fabricating an array of diamond color center microchiplets from a solid block of diamond. They also designed and fabricated nanoscale optical antennas that enable more efficient collection of the photons emitted by these color center qubits in free space.

Then, they designed and mapped out the chip from the semiconductor foundry. Working in the MIT.nano cleanroom, they post-processed a CMOS chip to add microscale sockets that match up with the diamond microchiplet array.

They built an in-house transfer setup in the lab and applied a lock-and-release process to integrate the two layers by locking the diamond microchiplets into the sockets on the CMOS chip. Since the diamond microchiplets are weakly bonded to the diamond surface, when they release the bulk diamond horizontally, the microchiplets stay in the sockets.

“Because we can control the fabrication of both the diamond and the CMOS chip, we can make a complementary pattern. In this way, we can transfer thousands of diamond chiplets into their corresponding sockets all at the same time,” Li says.

The researchers demonstrated a 500-micron by 500-micron area transfer for an array with 1,024 diamond nanoantennas, but they could use larger diamond arrays and a larger CMOS chip to further scale up the system. In fact, they found that with more qubits, tuning the frequencies actually requires less voltage for this architecture.

“In this case, if you have more qubits, our architecture will work even better,” Li says.

The team tested many nanostructures before they determined the ideal microchiplet array for the lock-and-release process. However, making quantum microchiplets is no easy task, and the process took years to perfect.

“We have iterated and developed the recipe to fabricate these diamond nanostructures in MIT cleanroom, but it is a very complicated process. It took 19 steps of nanofabrication to get the diamond quantum microchiplets, and the steps were not straightforward,” he adds.

Alongside their QSoC, the researchers developed an approach to characterize the system and measure its performance on a large scale. To do this, they built a custom cryo-optical metrology setup.

Using this technique, they demonstrated an entire chip with over 4,000 qubits that could be tuned to the same frequency while maintaining their spin and optical properties. They also built a digital twin simulation that connects the experiment with digitized modeling, which helps them understand the root causes of the observed phenomenon and determine how to efficiently implement the architecture.

In the future, the researchers could boost the performance of their system by refining the materials they used to make qubits or developing more precise control processes. They could also apply this architecture to other solid-state quantum systems.

This work was supported by the MITRE Corporation Quantum Moonshot Program, the U.S. National Science Foundation, the U.S. Army Research Office, the Center for Quantum Networks, and the European Union’s Horizon 2020 Research and Innovation Program.



Two big computer vision papers boost prospect of safer self-driving vehicles

New chip and camera technology bring closer potential of hands-free road time.

Like nuclear fusion and jet-packs, the self-driving car is a long-promised technology that has stalled for years, yet armed with new research, boffins think they have found potential improvements.

Citizens of Phoenix, San Francisco, and Los Angeles are able to take one of Waymo's self-driving taxis, first introduced to the public in December 2020. But they have not been without their glitches. Just last month in San Francisco, for example, one of the taxi service's autonomous vehicles drove down the wrong side of the street to pass a unicycle. In December last year, a Waymo vehicle hit a backwards-facing pickup truck , resulting in a report with the US National Highway Traffic Safety Administration (NHTSA) and a software update.

But this week, not one but two groups of researchers bidding to improve the performance of self-driving cars and other autonomous vehicles have published papers in the international science journal Nature.

A design for a new chip geared toward autonomous vehicles has arrived from China. Tsinghua University's Luping Shi and colleagues have taken inspiration from the human visual system by combining fast, lower-accuracy event-based detection with slower but more accurate image-based perception.


The researchers were able to show the chip — dubbed Tianmouc — could process pixel arrays quickly and robustly in an automotive driving perception system.

In a paper published today, the authors said: "We demonstrate the integration of a Tianmouc chip into an autonomous driving system, showcasing its abilities to enable accurate, fast and robust perception, even in challenging corner cases on open roads. The primitive-based complementary sensing paradigm helps in overcoming fundamental limitations in developing vision systems for diverse open-world applications."

In a separate paper, Davide Scaramuzza, a professor of robotics and perception at the University of Zurich, and his colleagues adopt a similar hybrid approach but apply it to camera technology.

(Embedded explanatory YouTube video.)

Cameras for self-driving vehicles navigate a trade-off between bandwidth and latency. High-resolution color cameras capture fine detail, but detecting rapid changes with them requires high bandwidth. Conversely, reducing the bandwidth increases latency, delaying the processing of data needed for potentially life-saving decisions.

To get out of this bind, the Swiss-based researchers developed a hybrid system that pairs an event camera with a conventional high-bandwidth frame camera. Event cameras record only intensity changes and report them as sparse measurements, so they do not suffer from the bandwidth/latency trade-off.
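
As a rough illustration of that principle (not the authors' implementation), the sketch below converts the change between two ordinary frames into sparse events: only pixels whose log-intensity changes by more than a contrast threshold are reported. The threshold and frame sizes are illustrative assumptions.

```python
# Minimal sketch of the event-camera principle: report only pixels whose
# log-intensity changes by more than a contrast threshold, as sparse events.
import numpy as np

def frames_to_events(prev_frame, frame, contrast_threshold=0.15):
    """Return sparse (x, y, polarity) events for pixels that changed enough."""
    eps = 1e-6                                           # avoid log(0)
    delta = np.log(frame + eps) - np.log(prev_frame + eps)
    ys, xs = np.nonzero(np.abs(delta) > contrast_threshold)
    polarity = np.sign(delta[ys, xs]).astype(int)
    return list(zip(xs.tolist(), ys.tolist(), polarity.tolist()))

rng = np.random.default_rng(1)
prev, cur = rng.random((8, 8)), rng.random((8, 8))
events = frames_to_events(prev, cur)
print(f"{len(events)} events instead of {cur.size} pixel values")
```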

The event camera detects changes during the blind time between image frames. The event data are converted into a graph that evolves over time and connects nearby points, and that graph is processed locally. The resulting hybrid object detector reduces detection time in dangerous high-speed situations, according to an explanatory video.
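
The graph construction can be pictured with a small sketch like the one below, which links events that are close in space and time. It is a naive O(n²) illustration under assumed radius and time-window values, not the paper's method.

```python
# Toy sketch: link events that fall within an assumed spatial radius and time
# window, producing a graph that downstream processing (e.g. a graph neural
# network) could consume. Parameter values are illustrative assumptions.
import numpy as np

def build_event_graph(events, radius=3.0, time_window=0.01):
    """events: iterable of (x, y, t) rows. Returns a list of edges (i, j)."""
    events = np.asarray(events, dtype=float)
    edges = []
    for i in range(len(events)):
        for j in range(i + 1, len(events)):
            dx, dy, dt = events[j] - events[i]
            if abs(dt) <= time_window and np.hypot(dx, dy) <= radius:
                edges.append((i, j))
    return edges

sample_events = [(10, 12, 0.001), (11, 12, 0.002), (40, 5, 0.003), (12, 13, 0.004)]
print(build_event_graph(sample_events))   # events 0, 1, and 3 are linked
```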

In their paper, the authors say: "Our method exploits the high temporal resolution and sparsity of events and the rich but low temporal resolution information in standard images to generate efficient, high-rate object detections, reducing perceptual and computational latency."

They argue that pairing a 20 frames-per-second RGB camera with an event camera can achieve the same latency as a 5,000 fps camera while using only the bandwidth of a 45 fps camera, without compromising accuracy.
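
A back-of-envelope calculation shows why those numbers matter, assuming a frame camera's worst-case detection latency is roughly one frame period and its bandwidth scales linearly with frame rate (the per-frame size below is an assumed 1080p RGB frame):

```python
# Rough arithmetic under the stated assumptions: latency ~ one frame period,
# bandwidth ~ frame rate times frame size. The 1080p RGB frame size is assumed.
frame_bytes = 1920 * 1080 * 3          # one uncompressed 1080p RGB frame

for fps in (20, 45, 5000):
    latency_ms = 1000.0 / fps          # worst-case wait for the next frame
    bandwidth_mb_s = fps * frame_bytes / 1e6
    print(f"{fps:>5} fps: ~{latency_ms:6.2f} ms latency, ~{bandwidth_mb_s:8.1f} MB/s")

# The hybrid system aims for roughly the 0.2 ms latency of the 5,000 fps row
# while moving data at about the 45 fps row's rate.
```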

"Our approach paves the way for efficient and robust perception in edge-case scenarios by uncovering the potential of event cameras," the authors write.

With a hybrid approach to both cameras and data processing in the offing, more widespread adoption of self-driving vehicles may be just around the corner. ®

