A Verilog Programming Learning Assistant System Focused on Basic Verilog with a Guided Learning Method
Abstract
1. Introduction
2. Related Works
2.1. The Importance of Hardware Description Languages
2.2. Research on HDLs and Large Language Models
2.3. Exploration of Instructional Strategies
2.4. Technology-Supported Programming Learning Systems
2.5. Overview of Online HDL Learning Resources
3. Design of the Learning System and Content Structure
3.1. System Interface Introduction
- Source Coding: This section provides complete Verilog code implementing a specific digital circuit. Fill-in-the-blank sections ask students to supply the missing code according to the requirements of the question, strengthening their understanding of syntax and program logic (a hypothetical example of the format appears after this list).
- Answer: This section contains quizzes on concepts related to the code and is designed to assess students’ understanding of grammatical structures, digital circuit behaviors, and key design concepts. Learners must apply their knowledge to interpret the provided code and solve each problem, with the platform’s real-time feedback mechanism immediately verifying the correctness of their answers.
- Related Information: This section offers supplementary learning materials aligned with each exercise, including annotated example code, in-depth conceptual explanations, and digital-circuit schematics or timing waveforms. Designed to support self-directed study, it enables students to consult these resources during problem-solving, reinforcing their understanding of programming constructs and digital-logic fundamentals.
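To make the Source Coding format concrete, the following is a hypothetical fill-in-the-blank item of the kind described above (a minimal sketch; the platform's actual exercises may differ). In the exercise, the right-hand side of the assignment is blanked out and the learner must supply it; the completed answer is shown here:

```verilog
// Hypothetical Source Coding exercise: 2-to-1 multiplexer.
module mux2to1 (
  input  wire a,    // data input 0
  input  wire b,    // data input 1
  input  wire sel,  // select signal
  output wire y     // selected output
);
  // In the exercise, this expression is the blank to fill in.
  assign y = sel ? b : a;
endmodule
```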
3.2. Answering Process
- Concept Acquisition: Students first browse the Reference Material Section to study the background knowledge, sample code, and digital circuits and timing waveforms related to the question, grasping the core concepts it covers. This stage helps students build a basic understanding and lays the foundation for subsequent code reading and problem-solving.
- Code Analysis: After understanding the basic concepts, students move to the Source Code Section to observe and analyze the provided Verilog code. In some questions, the code will include fill-in-the-blank sections, requiring students to infer the missing program fragments to further deepen their understanding of Verilog syntax and circuit operation logic.
- Answer and Submission: Students fill in their answers in the Answer Section. After completing all questions, they click the “Answer” button to submit, as shown in Figure 2. The system automatically compares the submitted answers against the correct answers stored in the back end and reports the accuracy rate in real time. Wrong answers are marked in red to help students identify and correct them.
- Iterative Learning and Mastery: Students can make multiple attempts until all answers are correct. This system provides instant feedback and correction opportunities, helping students consolidate their learning through repeated practice and ensuring a complete understanding of Verilog syntax and digital circuit concepts.
- Section Score and Tracking: When students have completed all questions in a section, they can enter their personal ID to record their learning outcomes, and the system will output the student’s final answers for the section, as shown in Figure 3. This mechanism can be used for students’ self-assessment and can also serve as a reference for teachers to monitor learning progress and effectiveness.
3.3. Question Generation Process
- Definition of Learning Concepts: According to the course requirements, teachers determine the Verilog syntax, digital circuit concepts, or design methods covered by the questions to ensure that students can master the target knowledge points through answering these questions.
- Verilog Code Generation: Based on the defined teaching concepts, teachers write Verilog code to implement the corresponding digital circuit functions and ensure that the code matches the teaching content (a minimal authoring-and-verification sketch follows this list).
- Code Simulation and Verification: Teachers use professional digital circuit simulation tools such as ModelSim, Vivado, and Quartus to run the Verilog program to confirm that its functions and logic are correct, preventing incorrect programs from affecting students’ learning.
- Question Design: Teachers design targeted test questions based on the code contents or related concepts to ensure that students can strengthen their understanding of grammar and logic through answering the questions.
- Ensuring Unique Answers: Teachers ensure that every question has a unique answer so that ambiguous answers do not distort students’ learning outcomes or the accuracy of automatic assessment.
- Supplementary Learning Resources: Teachers add relevant supplementary learning resources, such as concept descriptions, images, tables, or sample code, to help students understand the topic backgrounds and core concepts.
- Automated Conversion to HTML Format: Automated tools convert the content into HTML format to generate an interactive web-based quiz environment in which students answer questions directly, ensuring that the platform can provide real-time assessment and a visual learning experience.
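As a minimal sketch of the code generation and verification steps (our own illustrative example, not an item from the platform's question bank), a teacher might write a half adder and a small testbench, run it in ModelSim, Vivado, or Quartus, and confirm the printed outputs before designing questions around the module:

```verilog
// Candidate teaching module: half adder.
module half_adder (
  input  wire a,
  input  wire b,
  output wire sum,
  output wire carry
);
  assign sum   = a ^ b;  // XOR produces the sum bit
  assign carry = a & b;  // AND produces the carry bit
endmodule

// Testbench that exercises all four input combinations so the
// teacher can verify the behavior before writing questions.
module half_adder_tb;
  reg  a, b;
  wire sum, carry;
  integer i;

  half_adder dut (.a(a), .b(b), .sum(sum), .carry(carry));

  initial begin
    for (i = 0; i < 4; i = i + 1) begin
      {a, b} = i;  // truncates i to the two stimulus bits
      #1;
      $display("a=%b b=%b -> sum=%b carry=%b", a, b, sum, carry);
    end
    $finish;
  end
endmodule
```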
3.4. Section Introduction
3.4.1. Section 1: Verilog Introduction
3.4.2. Section 2: Basic Architecture
3.4.3. Section 3: Data Conversion
3.4.4. Section 4: Verilog Operator
3.5. Related Work Comparison
4. Evaluation
4.1. Evaluation Subjects and Methodology Design
4.2. Evaluation Indicators
- First-Attempt Accuracy: This examines the proportion of questions answered correctly upon the student’s initial submission, reflecting intuitive understanding or prior knowledge.
- Final Accuracy: This examines the proportion of questions answered correctly upon the student’s final submission, after multiple attempts, indicating the level of understanding achieved through iterative learning and correction.
- Number of Submissions: This represents the total number of attempts made for each question, reflecting both the student’s level of engagement and the relative difficulty of the item. (All three indicators are formalized below.)
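Stated formally (our notation, not the paper's), for a section with Q questions, where s_q denotes the number of submissions a student makes on question q:

```latex
\text{First-Attempt Accuracy} = \frac{\lvert \{\, q : \text{first submission to } q \text{ is correct} \,\} \rvert}{Q}, \qquad
\text{Final Accuracy} = \frac{\lvert \{\, q : \text{last submission to } q \text{ is correct} \,\} \rvert}{Q}, \qquad
\text{Avg. Submissions} = \frac{1}{Q} \sum_{q=1}^{Q} s_q
```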
4.3. Results Analysis
4.3.1. Section 1 Analysis
Group 1 Test Results
Individual Student Performance in Group 1
Group 2 Test Results
Individual Student Performance in Group 2
4.3.2. Section 2 Analysis
Group 1 Test Results
- Question 5: This question lists several cases involving different bit widths and different radices and asks students to write the corresponding number declarations. We believe that some students are familiar only with binary and hexadecimal and are unfamiliar with octal, which resulted in fewer correct answers on this question than on the others.
- Question 8: This question covers the “truncation” and “extension” mechanisms that apply when variables of different bit widths are assigned to each other in Verilog, and students must judge whether each statement is correct or incorrect. We suspect that students overlooked the distinction between signed and unsigned types: they often assume that sign extension is always performed when the source is narrower than the target. In fact, sign extension is performed only when the source expression is signed; otherwise, zero extension occurs (both points are illustrated in the sketch after this list).
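As a hedged illustration of both observations (plain Verilog written for this discussion, not the platform's question code), the four literal radices and the extension rule can be seen in a few lines:

```verilog
module literal_and_extension_demo;
  // Sized literals in all four radices; the octal form is the one
  // students are assumed to be least familiar with (Question 5).
  reg [7:0] bin_v = 8'b1010_0110;  // binary
  reg [7:0] oct_v = 8'o246;        // octal    (same value, 166)
  reg [7:0] dec_v = 8'd166;        // decimal
  reg [7:0] hex_v = 8'hA6;         // hexadecimal

  // Extension on assignment (Question 8): the signedness of the
  // source expression decides the rule, so an unsigned source is
  // zero-extended even though its MSB is 1.
  reg        [3:0] u4 = 4'b1100;   // unsigned, MSB set
  reg signed [3:0] s4 = 4'sb1100;  // signed, value -4
  reg        [7:0] zext;
  reg signed [7:0] sext;

  initial begin
    zext = u4;  // zero extension: 8'b0000_1100
    sext = s4;  // sign extension: 8'b1111_1100, still -4
    $display("zext=%b sext=%b (%0d)", zext, sext, sext);
  end
endmodule
```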
Individual Student Performance in Group 1
Group 2 Test Results
Individual Student Performance in Group 2
4.3.3. Section 3 Analysis
Group 1 Test Results
- Question 5: This question asks participants to convert a binary value into a fixed-point number. The average first-attempt accuracy is 75.6%; the question is harder for students encountering fixed-point numbers for the first time. However, on Question 6, which asks students to convert fixed-point numbers back to binary values, the first-attempt accuracy improved markedly, to 86.8%. This suggests that practice substantially improves students’ grasp of a concept and that they perform better when exposed to similar questions again.
- Question 7: This question asks students to convert a binary value into the corresponding signed number. The first-attempt accuracy is 75.2%. For students encountering two’s-complement numbers for the first time, this question requires learning a new concept, so the first-attempt accuracy is lower. As with Questions 5 and 6, Question 8 asks for the reverse conversion, and after gaining an initial understanding of the concept, students improve significantly on this second exposure to a similar question (a short sketch of these readings follows this list).
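To make the two conversions concrete, here is a minimal sketch (with an assumed 4-bit pattern of our choosing, not the platform's question data) reading the same bits as an unsigned integer, a two's-complement integer, and a fixed-point value:

```verilog
module reading_demo;
  reg [3:0] b = 4'b1011;  // one bit pattern, three interpretations

  initial begin
    // Unsigned reading: 8 + 2 + 1 = 11
    $display("unsigned        : %0d", b);
    // Two's-complement reading (Questions 7 and 8): -8 + 2 + 1 = -5
    $display("two's complement: %0d", $signed(b));
    // Fixed-point reading with two fraction bits (Questions 5 and 6):
    // 10.11b = 2 + 0.5 + 0.25 = 2.75
    $display("fixed point     : %f", b / 4.0);
  end
endmodule
```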
Individual Student Performance in Group 1
Group 2 Test Results
Individual Student Performance in Group 2
4.3.4. Section 4 Analysis
Group 1 Test Results
- Question 4: This topic focused on how division is performed in binary, which differs somewhat from traditional decimal division. This unfamiliarity resulted in a lower initial accuracy. However, after students understood the underlying computational method, the final accuracy rose to 99%, indicating that although the problem was challenging, students were able to master the concept through guided practice.
- Question 6: This question addressed signed binary addition, which requires students to account for sign extension and overflow conditions not present in unsigned addition. The need for additional reasoning resulted in a lower first-attempt accuracy. Nevertheless, after multiple attempts and conceptual clarification, students’ final accuracy neared 100%.
- Question 8: Students performed signed multiplication—a more complex process requiring a comprehensive understanding of the binary arithmetic rules. As expected, both the average number of attempts and initial accuracy were less favorable; the final accuracy for this item was 95.3%. This may be attributed to the complexity of the problem, insufficiently detailed guidance, or gaps in the design of the problem, suggesting areas for improvement.
- Question 12: This question involved binary signal concatenation and replication. The first-attempt accuracy was only 48.6%, which is relatively low. Upon review, the item itself did not appear excessively complex; the lower accuracy may have resulted from unclear wording that led students to overlook details, or from the abstract nature of the concept for beginners (see the sketch following this list).
- Questions 8, 10, 11, and 12: These questions had the lowest final accuracy rates, all below 98%. These topics primarily concerned signed operations, which students found more challenging, as expected given their increased complexity. To address this, future improvements could include breaking down complex concepts into more granular subtopics for smoother learning curves, or increasing the number of practice problems to reinforce understanding through repetition.
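The sketch below (illustrative code with assumed 4-bit operands, not the platform's actual items) demonstrates the concatenation and replication operators from Question 12 and the implicit sign extension behind the signed arithmetic in Questions 6 and 8:

```verilog
module operator_demo;
  reg        [3:0] a  = 4'b1100;
  reg        [3:0] b  = 4'b0011;
  reg signed [3:0] sa = -4;       // bit pattern 1100 read as signed
  reg signed [3:0] sb = 3;
  reg        [7:0] cat;
  reg signed [7:0] prod;

  initial begin
    // Concatenation joins vectors; replication repeats them.
    cat = {a, b};                       // 8'b1100_0011
    $display("{a,b}    = %b", cat);
    $display("{2{b}}   = %b", {2{b}});  // 8'b0011_0011
    // Signed multiply: operands are sign-extended to the 8-bit
    // context width before multiplying, so the result is -12.
    prod = sa * sb;
    $display("sa*sb    = %0d", prod);
    // Arithmetic vs. logical right shift of a negative value.
    $display("sa >>> 1 = %b", sa >>> 1);  // 1110 (sign bit kept)
    $display("a  >>  1 = %b", a >> 1);    // 0110 (zero filled)
  end
endmodule
```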
Individual Student Performance in Group 1
- Increased Number of Submissions Indicates Greater Learning Challenges: The chart shows that most students required more submissions per question than in previous sections. For the students beyond ID 24, the average number of submissions commonly exceeded two, with those in the latter part of the group needing four to six attempts or more. This reflects the heightened difficulty of this section, which covers a wide range of complex arithmetic and logic operations, necessitating repeated practice to arrive at correct solutions.
- Higher Topic Difficulty Reveals Greater Variability Among Students: The average accuracy of the first submission drops from about 97% for the leading students to approximately 70% for those toward the end, indicating a clear gap in students’ comprehension of operations, bit manipulations, and signed versus unsigned calculations. For some students, even the final accuracy remained quite low, suggesting that certain problems may have been excessively difficult or overly complex, hindering effective problem-solving.
- Summary: Overall, Section 4 represents a significant increase in learning difficulty within the system’s curriculum. The trends in average submission count and first-attempt accuracy suggest that additional instructional supports are warranted for advanced operations, such as more detailed worked examples, targeted error correction strategies, and the pre-class categorization of problem types to help students gradually build the necessary computational skills.
Group 2 Test Results
Individual Student Performance in Group 2
4.3.5. Comparative Analysis
Analysis Across Sections
Analysis of Performance Differences Between Groups
5. Discussion
5.1. Effectiveness and Advantages of Question Design
5.2. Summary and Analysis of Overall Results
5.3. Observations and Future Improvements
- Enhance the design of learner questionnaires to gain deeper insights into potential difficulties encountered during the answering process.
- Incorporate the tracking of response time as a basis for analyzing learning behaviors and progress.
- Implement appropriate restrictions in the testing environment, such as clearly prohibiting collaboration during assessment periods, to ensure the authenticity of the data.
- Introduce a midterm and final examination mechanism to complement ongoing practice and provide more comprehensive and credible evidence of learning achievement.
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Wilson, P. Design Recipes for FPGAs: Using Verilog and VHDL; Newnes: Oxford, UK, 2015.
- Brayton, R.K.; Rudell, R.; Sangiovanni-Vincentelli, A.; Wang, A.R. MIS: A multiple-level logic optimization system. IEEE Trans. Comput.-Aided Des. Integr. Circuits Syst. 1987, 6, 1062–1081.
- Tang, Y.; Hare, R.; Ferguson, S. Classroom evaluation of a gamified adaptive tutoring system. In Proceedings of the 2022 IEEE Frontiers in Education Conference (FIE), Uppsala, Sweden, 8–11 October 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 1–5.
- Hare, R.; Tang, Y.; Ferguson, S. An intelligent serious game for digital logic education to enhance student learning. IEEE Trans. Educ. 2024, 67, 387–394.
- Ismail, I.; Zabidi, M.M.A.; Paraman, N.; Mohd-Yusof, K.; Rahman, N.F.A. Active and project-based learning implementation in a constructively aligned digital systems design course. IEEE Trans. Educ. 2023, 66, 602–611.
- Funabiki, N.; Matsushima, Y.; Nakanishi, T.; Watanabe, K.; Amano, N. A Java programming learning assistant system using test-driven development method. IAENG Int. J. Comput. Sci. 2013, 40, 38–46.
- Jin, S.; Funabiki, N.; Fang, T.; Hsieh, P.; Fan, Y.; Kao, W.; Lee, Y. An implementation of grammar-concept understanding problems in Verilog programming learning assistant system. In Proceedings of the 13th International Conference on Information and Education Technology (ICIET), Fukuyama, Japan, 18–20 April 2025; pp. 280–284.
- Rodríguez-Vidal, J.; Martínez, R.; García-Beltrán, Á. C-programming self-assessment exercises versus final exams: 12 years of experience. Comput. Appl. Eng. Educ. 2023, 31, 1272–1288.
- Quevedo, J.A.; Maldonado, Y. From MLIR to scheduled CDFG: A design flow for hardware resource estimation. IEEE Embed. Syst. Lett. 2025. Early access.
- Xu, X.; Peng, X. Research and practice on improving the efficiency of digital chip design description language. In Proceedings of the 2023 9th Annual International Conference on Network and Information Systems for Computers (ICNISC), Wuhan, China, 27–29 October 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 451–456.
- Liu, S.; Fang, W.; Lu, Y.; Wang, J.; Zhang, Q.; Zhang, H.; Xie, Z. RTLCoder: Fully open-source and efficient LLM-assisted RTL code generation technique. IEEE Trans. Comput.-Aided Des. Integr. Circuits Syst. 2024, 44, 1448–1461.
- Yang, Y.; Teng, F.; Liu, P.; Qi, M.; Lv, C.; Li, J.; Zhang, H.; He, Z. Haven: Hallucination-mitigated LLM for Verilog code generation aligned with HDL engineers. In Proceedings of the 2025 Design, Automation & Test in Europe Conference (DATE), Lyon, France, 31 March–2 April 2025; IEEE: Piscataway, NJ, USA, 2025; pp. 1–7.
- Pearce, H. Experiences scaffolding a computer engineering project course to improve student outcomes. In Proceedings of the 2025 IEEE International Symposium on Circuits and Systems (ISCAS), London, UK, 25–28 May 2025; IEEE: Piscataway, NJ, USA, 2025; pp. 1–5.
- Chowdhury, S.R.; Wardhan, H.; Karri, S.K.; Kode, S.; Nagaraju, K. Smart learning environments for teaching electronics to students. In Proceedings of the 2013 IEEE Fifth International Conference on Technology for Education (T4E), Kharagpur, India, 18–20 December 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 174–175.
- Hoefer, J.; Gauß, M.; Adams, M.; Kreß, F.; Kempf, F.; Karle, C.; Harbaum, T.; Barth, A.; Becker, J. A challenge-based blended learning approach for an introductory digital circuits and systems course. In Proceedings of the 2024 IEEE International Symposium on Circuits and Systems (ISCAS), Singapore, 19–22 May 2024; IEEE: Piscataway, NJ, USA, 2024; pp. 1–5.
- Chen, S.L.; Lai, Y.K.; Hu, W.C.; Chung, W.Y. Case-based instruction of digital integrated circuit design courses for non-major undergraduates. In Proceedings of the 2013 Learning and Teaching in Computing and Engineering, Macau, Macao, 21–24 March 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 172–177.
- Pădurean, V.-A.; Denny, P.; Gotovos, A.; Singla, A. Prompt programming: A platform for dialogue-based computational problem solving with generative AI models. arXiv 2025, arXiv:2503.04267.
- Hrvacević, L.; Cincović, J.; Milaković, A.; Jocović, V.; Matveiev, V.; Drašković, D. Development of a web system with an automated question generator based on large language models. In Proceedings of the 2024 11th International Conference on Electrical, Electronic and Computing Engineering (IcETRAN), Nis, Serbia, 3–6 June 2024; pp. 1–6.
- Perera, P.; Tennakoon, G.; Ahangama, S.; Panditharathna, R.; Chathuranga, B. A systematic mapping of introductory programming languages for novice learners. IEEE Access 2021, 9, 88121–88136.
- El-Medany, W.M.; Ismail, Y. Mobile learning laboratory for hardware courses. In Proceedings of the 2013 Fourth International Conference on e-Learning “Best Practices in Management, Design and Development of e-Courses: Standards of Excellence and Creativity”, Manama, Bahrain, 7–9 May 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 51–54.
- Lee, K. The DEVBOX Development Education Platform: An Environment for Introducing Verilog to Young Students. Master’s Thesis, University of British Columbia, Vancouver, BC, Canada, 2016.
- Zhang, H.; Lu, J.; Xu, W. Verilog OJ: Research and implementation of a Verilog HDL online judge system. In Proceedings of the 2023 IEEE International Conference on Image Processing and Computer Applications (ICIPCA), Changchun, China, 11–13 August 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 1398–1401.
- Al Amin, R.; Wiese, V.; Jacobs, S.; Hardebusch, T.; Jaschke, S.; Obermaisser, R. Requirement analysis and didactic evaluation of a collaborative remote laboratory for FPGAs. In Proceedings of the 2025 IEEE Global Engineering Education Conference (EDUCON), London, UK, 22–25 April 2025; IEEE: Piscataway, NJ, USA, 2025; pp. 1–5.
Question topics for Section 1 (Verilog Introduction):

Topic | # of Questions |
---|---|
Start | 6 |
Simple wire | 5 |
Three wires | 6 |
Four wires | 9 |
Not Gate | 5 |
And Gate | 5 |
Or Gate | 6 |
Xor Gate | 6 |
DFF | 4 |
Always Example | 6 |
Always Block Compare | 8 |
Combinational | 6 |
Sequential | 7 |
Combinational Example | 6 |
Sequential Example | 6 |
Wire and Reg 1 | 5 |
Wire and Reg 2 | 7 |
Wire and Reg 3 | 6 |
Design Level | 6 |
Design Level Example 1 | 5 |
Design Level Example 2 | 5 |
Design Level Example 3 | 7 |
Total number of questions | 132 |
Question topics for Section 2 (Basic Architecture):

Topic | # of Questions |
---|---|
4 Values | 6 |
Register | 8 |
Comment | 8 |
Value Declaration 1 | 14 |
Value Declaration 2 | 9 |
Vector | 15 |
Array | 15 |
Truncation and Extension | 8 |
Identifiers | 10 |
Gate Operator | 10 |
Dataflow Operator | 7 |
Dataflow Operator 2 | 10 |
Operator | 8 |
Total number of questions | 128 |
Question topics for Section 3 (Data Conversion):

Topic | # of Questions |
---|---|
Concept | 11 |
Bin <=> Dec | 10 |
Bin <=> Hex (1) | 10 |
Bin <=> Hex (2) | 10 |
Bin => Fixed point | 5 |
Fixed => Bin | 5 |
Bin => Signed Dec (2’s complement) | 5 |
Signed Dec => Bin (2’s complement) | 7 |
Total number of questions | 63 |
Question topics for Section 4 (Verilog Operator):

Topic | # of Questions |
---|---|
Unsigned Arithmetic (+[add]) | 6 |
Unsigned Arithmetic (−[sub]) | 5 |
Arithmetic (*[multiply]) | 5 |
Arithmetic (/[divide]) | 5 |
Arithmetic (%[modulus]) | 5 |
Signed Arithmetic (+[add]) | 4 |
Signed Arithmetic (−[sub]) | 4 |
Signed Arithmetic (*[multiply −1]) | 8 |
Signed Arithmetic (*[multiply −2]) | 4 |
Signed Arithmetic (/[divide]) | 4 |
Signed Arithmetic (%[modulus]) | 4 |
Concatenate and Replicate | 6 |
Logic Shift (<<[left], >>[right]) | 5 |
Arithmetic Shift (<<<[left], >>>[right]) | 5 |
Relational (<[less], >[greater]) | 6 |
Relational (<=[less], >=[greater]) | 6 |
Equality (==[equal], !=[inequal]) | 6 |
Binary bit-wise (~[not]) | 5 |
Binary bit-wise (&[and], |[or]) | 6 |
Binary bit-wise (^[xor]) | 5 |
Binary Logical (&&) | 4 |
Binary Logical (||,^^) | 5 |
Binary Logical vs Binary bit-wise | 6 |
Total number of questions | 119 |
Comparison of related learning platforms (Section 3.5):

Method | Function and Objective | Advantages |
---|---|---|
HDLBits | Provides online Verilog coding and real-time testing. Enables students to practice syntax and structural design. | No software installation required. Automatic online testing. A wide variety of project exercises. |
Mobile Learning Laboratory | Develops a portable hardware design platform that allows students to perform remote hardware design simulations via mobile devices. | Supports m-learning. Enables remote simulations. Highly accessible. Graphical user interface. |
DEVBOX | Develops a fully integrated software-based educational platform tailored for instructional dissemination. | Supports C/C++ and Verilog. Easy to install. Platform-independent. Low-cost and extendable. |
Verilog OJ | Establishes an online judge system for introductory hardware design projects. | Supports online compilation. Includes simulation and synthesis. Immediate error feedback with waveform visualization. Supports multiple question types. |
FPGARLab | Implements a collaborative remote FPGA laboratory platform. | Supports multiple HDL languages. Supports stepwise learning. Supports 24 h remote operation. |
VLPAS | Provides a guided online Verilog learning platform tailored for beginners. | Guided instruction. Structured section organization. Abundant exercises. Real-time feedback mechanism. |
Average results by section:

Section | First-Attempt Avg. Accuracy | Final Avg. Accuracy | Avg. Submissions |
---|---|---|---|
Section 1 Verilog Introduction | 86.57% | 99.76% | 2.28 |
Section 2 Basic Architecture | 77.64% | 99.72% | 3.13 |
Section 3 Data Conversion | 81.18% | 99.75% | 1.92 |
Section 4 Verilog Operator | 71.17% | 97.32% | 3.18 |
Average results by group:

Group | First-Attempt Avg. Accuracy | Final Avg. Accuracy | Avg. Submissions |
---|---|---|---|
Group 1 | 84.82% | 99.34% | 2.12 |
Group 2 | 74.01% | 98.94% | 3.13 |