International Journal of Evaluation and Research in Education (IJERE)
Vol. 12, No. 2, June 2023, pp. 790~797
ISSN: 2252-8822, DOI: 10.11591/ijere.v12i2.22886
Journal homepage: http://ijere.iaescore.com
Evaluating factors affecting attitudes of IT-intensive and
non-IT-intensive students towards e-assessment
Bao Linh Tran, Lan Phuong Nguyen
Department of Applied Economics, Institute of Interdisciplinary Social Sciences, Nguyen Tat Thanh University, Ho Chi Minh City,
Vietnam
Article Info ABSTRACT
Article history:
Received Sep 10, 2021
Revised Nov 30, 2022
Accepted Jan 3, 2023
Despite the significant shift to distance computer-based testing as an inevitable
outcome of Industry 4.0 and the COVID-19 lockdowns, little effort
has been made to research this new testing mode. To address this issue, this
study targets two groups of students, information technology (IT)-intensive and
non-IT-intensive, with the aim of investigating which factors effectively
encourage each group to adopt online assessment and whether their majors
cause any differences in the students’ attitudes. Based on the student
perception of e-assessment questionnaire (SPEAQ) with some slight
modifications, a final 28-item survey was formed and distributed to 400
students. Results show that security and affective factors had the strongest
impact on both groups of students, the impact of validity and practicality
varied between the two groups, and reliability and teaching-learning ranked
lowest. Besides, there were no noticeable differences in the attitudes of
students coming from different majors.
Keywords:
COVID-19
Digital transformation
E-assessment
Students’ attitudes
Vietnam
This is an open access article under the CC BY-SA license.
Corresponding Author:
Bao Linh Tran
Department of Applied Economics, Institute of Interdisciplinary Social Sciences,
Nguyen Tat Thanh University
Nguyen Tat Thành, Ho Chi Minh City 70000, Vietnam
Email: tranbaolinh0108@gmail.com
1. INTRODUCTION
Enhancing the quality of teaching and learning has always been a national priority in Vietnam [1].
Over the years, the Vietnamese government has invested more than 20% of total budget expenditure in
educational innovation projects, among which technology application is of great pre-eminence [2]. Currently,
with the aim of responding to both COVID-19 and the ongoing digital transformation, computer-based
assessments have been applied in Vietnamese academic institutions more intensively than ever. However, this
strategy still receives many opposing views, as some institutions argue that computer-based assessment remains
a new and largely unguided concept that might cost a huge share of the national budget to implement [3].
Moreover, numerous issues of this new assessment mode, such as cheating, technical problems, and students’
unwillingness to adopt it, remain unsolved [4]. Regarding research gaps, whilst many studies have
addressed the perspectives of instructors, e-learning experts, and educational technologists, little is known
about students’ perceptions [5], especially in the computer-based test field [6], [7]. It is strongly posited
that educationalists need to study test-takers’ attitudes towards e-assessment: if students lack trust
in the test, their reluctant participation might seriously affect learning outcomes [8].
Acknowledging this importance, we carried out a study utilizing the student
perception of e-assessment questionnaire (SPEAQ) [9] to contribute more reliable findings to this still
underexplored research stream. Specifically, this study aims to propose a new measurement scale based on SPEAQ
and the outcome variable of attitude; to identify which factors of e-exams mostly affect students’ perception;
to explore any differences in the mindset of students majoring in information technology (IT) and their
counterparts; and to propose several approaches for future innovative plans based on the statistical findings
of this research.
To fulfil these objectives, a qualitative study with experts and a quantitative survey of students
from the IT faculty and other faculties at Nguyen Tat Thanh University were used. Nguyen Tat Thanh University is a private
university in the South of Vietnam that is well known for its dedicated effort in digital educational
innovation. As stated on its website, the school has its own e-learning institute whose objectives
are to open learning opportunities for students anytime and anywhere, provide up-to-date materials with
quick access, and utilize technology to enhance students’ learning and research autonomy. During
COVID-19, the e-learning institute worked hard to avoid interruptions in students’ learning and assessment
processes by delivering both in e-learning form. The research was carried out from March
2021, when both on-campus and off-campus students had had at least one semester to experience their online
training programs. Therefore, the school is an ideal place for research implementation.
2. LITERATURE REVIEW
2.1. Online assessment
Together with various types of computer-based teaching modules such as e-learning and blended
learning, the integration of technology into assessment procedures seems inevitable with a typical example of
electronic assessment. E-assessment is defined as an act of using technological tools to aid the school in
assessing students’ performance [9]. Meanwhile, e-exams are also seen as a type of assessment which stores,
delivers, and records students’ marks and feedback. This process is supported by various technological
gadgets such as laptops, desktop computers, smartphones, iPads, and tablets. Different delivery formats,
such as Word documents, PDFs, videos, photos, and simulations, can also be used instead of paper [10]. From a
functional perspective, e-assessments can be categorized by the test’s purpose into diagnostic, formative, and summative
tests, and it is the power of instant marking and feedback that makes e-assessments far superior to
their paper-based counterpart. Based on the time and place of taking a test, e-assessment can be divided into synchronous
and asynchronous assessments. Synchronous tests are often used for high-stakes exams, which require
everything to run smoothly. Thus, students are asked to sit exams at the same time in one or more on-site PC
labs. Meanwhile, asynchronous e-exams are often used for low-stakes, resource-intensive and un-invigilated
tests. Thus, these tests can be done at any time with/without a specific time allotment [11].
2.2. Student attitudes to use e-assessment
The attitude towards adopting technology has been widely studied due to its strong influence on customer
satisfaction and behavioral intention, which later generate the actual use of digital products or services [12],
[13]. In the field of e-assessment, the attitudes and behaviors of test-takers should be critically and
thoroughly investigated to guarantee the test’s face validity and students’ engagement and cooperation levels
[14]. From the same viewpoint, SPEAQ was introduced by Dermo to identify the factors that
have the most significant effect on the perceptions of students at the University of Bradford, England [15].
However, due to the lack of an outcome variable, students’ attitudes were only measured by the mean value
of each item on a 5-point Likert scale, classifying responses as positive (mean>3.25) or negative
(mean<2.75). A succeeding researcher combined SPEAQ with the
outcome variable of ‘willingness to adopt’, which enabled her to further identify factors
significantly or insignificantly affecting students’ attitudes in a private university in India [16].
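To illustrate this mean-based reading of attitudes, the short sketch below applies the reported cut-offs to a set of construct means; the Python code and the figures are our own illustration and purely hypothetical, not taken from [15].

# Classify Likert construct means with the SPEAQ cut-offs reported in [15]:
# mean > 3.25 -> positive, mean < 2.75 -> negative, anything between -> neutral.
construct_means = {"security": 3.80, "pedagogy": 3.60, "reliability": 3.05}  # hypothetical values

def classify(mean):
    if mean > 3.25:
        return "positive"
    if mean < 2.75:
        return "negative"
    return "neutral"

for construct, m in construct_means.items():
    print(f"{construct}: mean={m:.2f} -> {classify(m)}")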
In the context of Vietnam, e-assessment has only recently started to gain popularity during
COVID-19. Thus, no research has been found on the attitudes of Vietnamese students towards this trend. For
these reasons, the present study proposes a new version of the scale based on SPEAQ and the outcome variable
of attitude. When it comes to the topic of student attitudes’ antecedents, six factors including affective
factors, validity, practicality, reliability, security and pedagogy were suggested to be reliable measurements
[15]. Among them, the affective factors refer to candidates’ feelings during the exam time, with the main
concentration on the sense of ease and comfort. Validity is related to the appropriateness of exam delivery
modes to specific curriculum design and university studies. Practicality concerns the advantages and barriers
of e-assessment as well as its practical applications. Reliability is defined as users' trust in the accuracy and
fairness of e-assessment compared to paper-based assessment. Security can be interpreted as users’
confidence in the ability of the system to protect them from cybercrime. Finally, pedagogy is to do
with the effectiveness of e-assessment on the teaching-learning process.
As mentioned earlier, most research projects related to e-assessment used the mean values of the six
proposed antecedents in SPEAQ to measure students’ attitudes. In a previous study [15], students responded
positively to all 17 items of the six constructs, with security and pedagogy being the two most satisfactory
factors; recorded at a mean of 3.051, reliability and fairness failed to achieve positive attitudes. In another study, the six
dimensions of the main scale were merged into only two dimensions, “comfort in use” and “learning
through e-tests”, with the latter obtaining a higher positivity degree [17]. Rather than using mean values, Pillai and
Prakash relied on t-statistics to identify the influence level of each construct on learners’ willingness to
adopt computerized tests [16], finding that, apart from reliability, whose influence was insignificant, the
other five constructs significantly impacted willingness to adopt. Those previous
findings have, therefore, led us to the following hypotheses: i) Affective factors positively affect students’
attitudes towards e-assessment (H1); ii) Validity positively affects students’ attitudes towards e-assessment
(H2); iii) Practicality positively affects students’ attitudes towards e-assessment (H3); iv) Reliability
positively affects students’ attitudes towards e-assessment (H4); v) Security positively affects students’
attitudes towards e-assessment (H5); vi) Teaching and learning positively affects students’ attitudes towards
e-assessment (H6); and vii) The IT major causes no difference in students’ attitudes towards e-assessment
(H7). The proposed model illustrating the above hypotheses is constructed as in Figure 1.
Figure 1. The final hypothesized research model
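Read as a regression specification, hypotheses H1 to H6 amount to positive slope coefficients in a single linear model of attitude, estimated separately for each group, while H7 is tested by comparing the two groups. The notation below is ours and is only a sketch of the model implied by Figure 1, not an equation reproduced from the original paper:

ATT_i = \beta_0 + \beta_1\,AF_i + \beta_2\,VA_i + \beta_3\,PR_i + \beta_4\,RE_i + \beta_5\,SE_i + \beta_6\,TL_i + \varepsilon_i, \qquad \text{H1–H6: } \beta_1,\dots,\beta_6 > 0.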
3. RESEARCH METHOD
Using SPEAQ as a reference, a measurement scale had to be developed for the specific context of
Vietnam. To do that, a qualitative focus group discussion was conducted with five experts
holding directorial positions: the Director of the E-learning Department, the Deputy Director of the E-learning
Department, the Dean of the Information Technology Faculty, the Deputy Dean of the Information Technology
Faculty, and the Director of the Interdisciplinary Institute of Social Sciences at the University.
3.1. Qualitative: Focus group discussion
Initially, the discussion panel decided to rephrase all negatively worded statements into
positive ones. For instance, the statement “I find it hard to concentrate on the questions when doing
an online exam” was changed into “I can focus on the test better when doing online assessments” (Table 1). This adjustment helps
keep students’ answers consistent, with the fifth answer option representing a completely
positive attitude and the first a completely negative one. However, to create a trap
question in the survey, only statement AF5 was left unchanged; its meaning is negative and opposite to
AF1. This single reverse-coded question is believed to keep respondents alert without causing much trouble for
data interpretation [18].
Next, the experts looked closely at the content of each item to make sure it fits the culture
of Vietnamese universities. For the reliability construct, the item coded I4 was removed as the
idea is too general and, therefore, unclear. For the validity construct, the item coded I5 in the original
SPEAQ was removed as outdated: e-exam questions are now not confined to multiple-choice forms
only, but appear in various types such as gap-filling, information matching, or even paragraph writing [11]. Items I2
and I4 were reworded into VA2 and VA4 to become clearer. For the practicality construct,
the item coded I3 in the original SPEAQ was removed on the grounds that e-exams are too short to
cause any serious health problems, while I5 was reworded into PR4 to become more transparent. For
the security construct, only the first observed variable, I1, was dismissed, as it is nearly impossible to judge
from a quick evaluation whether online assessment is as secure as paper-based testing, and such an item has a
high potential of receiving inconsiderate answers; all other items were kept.
As for the teaching and learning construct, I2, I3, and I4 were dismissed as their meanings are too general
and unclear, while two new benefits of e-assessment for the teaching-learning process were added with
all panel members’ agreement. Finally, the panel made no refinement to the items of the affective
factors and attitude constructs, as they are clearly worded. Overall, the contents of the refined items can be
viewed in Table 1.
Table 1. Indicators retained in the final model
Constructs Indicators
Affective factors
AF1 I feel comfortable when doing tests on e-learning
AF2 I can focus on test better when doing online assessments
AF3 Doing tests on computers is better because I get used to it
AF4 I believe that e-exams is going to replace traditional paper-based tests sooner or later
AF5 Doing tests on computers make me feel more stressful
Reliability RE1 Marking by computers is more accurate because computers don’t suffer from human mistakes
RE2 The technology that the university uses to run online exams is reliable
RE3 Online exams don’t favor students with good IT skills because they are done by basic steps
RE4 There is no chance of getting easier e-exams because questions are sorted by a test bank according to their difficulty
Validity VA1 Online assessment is appropriate for my subject area
VA2 The question types of e-exams are of various forms (matching, gap filling, paragraph writing)
VA3 E-assessments test not only students’ subject knowledge but also computer skills
VA4 University students must have sufficient digital abilities to deal with computer tests
Practicality PR1 Online assessments help save a certain amount of paper
PR2 Technical problems can be forecasted and solved by the school
PR3 It’s an opportunity for students to interact and feel confident with technology
PR4 E-assessment is more accessible as learners can take the test in their own comfortable places
Security SE1 The scores of online assessments are securely saved
SE2 Teachers have sensible methods to control cheating (i.e setting limited time, filming the whole process)
SE3 The online exam system is well protected from hackers
SE4 The information of account and password is well secured
Teaching and learning
TL1 The potential for immediate feedback could help students learn
TL2 The scores are displayed as soon as the tests are completed, minimizing worrisome wait
TL3 Online assessment goes hand-in-hand with e-learning
TL4 Online assessments benefit students’ experience with international standardized tests
Attitude ATT1 It is desirable for me to undertake e-assessment
ATT2 I think it is good for me to get accustomed to e-assessment
ATT3 Overall, my attitude towards e-assessment is favorable
3.2. Quantitative: Instrument design
After designing the hypothesized model and refining the scales to suit the Vietnamese context, all items
were measured using a five-point Likert scale ranging from 1=strongly disagree to 5=strongly agree.
Following this, the sample size was calculated to prepare for formal data collection.
Based on the formulas of [19], [20], a minimum sample size of 140 students was required to run both the
EFA and the regression analysis for this 28-item questionnaire. Aiming to compare the
attitudes of IT-intensive and non-IT-intensive students, 200 questionnaires were distributed to the
Information Technology Faculty and another 200 to random students outside the IT Faculty.
After removing faulty surveys that were incomplete or failed the trap question, 151 and 161 valid
responses were collected from the IT-majored and non-IT-majored groups respectively and stored
on a strictly secured server before the data analysis process.
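For reference, the 140-respondent minimum is consistent with two common rules of thumb that we read [19], [20] as implying: roughly five respondents per survey item for EFA, and 50 + 8m respondents for a regression with m predictors. The short Python sketch below shows that arithmetic; it is our own illustration, not the authors’ computation.

# Minimum-sample heuristics for EFA and multiple regression.
n_items = 28        # items in the final questionnaire
n_predictors = 6    # AF, VA, PR, RE, SE, TL

efa_minimum = 5 * n_items                    # five respondents per item -> 140
regression_minimum = 50 + 8 * n_predictors   # 50 + 8m rule -> 98

print(f"EFA minimum: {efa_minimum}, regression minimum: {regression_minimum}")
print(f"Overall minimum sample size: {max(efa_minimum, regression_minimum)}")  # 140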
3.3. Measure assessment
Using SPSS Version 25, model fitness was assessed with two tests: reliability analysis and EFA. For
the former, the two requirements of Cronbach’s alpha lying within the range 0.5<α<0.95 and
corrected item-total correlations being higher than 0.3 were applied [21]. The item AF5, whose corrected item-total
correlation was -0.374, was dismissed as it failed to meet the second requirement. After its removal, the test
was re-performed and all figures were within their threshold levels. The data set then entered the EFA,
whose primary purpose is to examine the convergent validity of the scales. The KMO value of all proposed
items was 0.827, within the recommended range of 0.5-1.0 [22]. Next came Bartlett’s test of sphericity,
whose p-value of 0.000<0.05 suggests a high correlation between the variables [23]. The total variance
explained, at 68.277%, exceeds the 50% requirement and is therefore acceptable [24]. In the rotated
component matrix, the factor loadings of both independent and dependent variables are higher than 0.5.
Furthermore, all constructs are well grouped into five clusters, with no item leaving its initial
independent group. Thus, the resulting factors are eligible to enter the regression analysis.
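The reliability and factorability checks described above can also be reproduced outside SPSS. The Python sketch below computes Cronbach’s alpha, corrected item-total correlations, and Bartlett’s test of sphericity for a hypothetical data frame of Likert responses; the column names and file are our assumptions, and the snippet illustrates the standard formulas rather than the authors’ exact procedure.

import numpy as np
import pandas as pd
from scipy.stats import chi2

def cronbach_alpha(items):
    """Cronbach's alpha for a block of Likert items (rows = respondents)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def corrected_item_total(items):
    """Correlation of each item with the sum of the remaining items."""
    total = items.sum(axis=1)
    return pd.Series({col: items[col].corr(total - items[col]) for col in items.columns})

def bartlett_sphericity(items):
    """Bartlett's test that the item correlation matrix is an identity matrix."""
    n, p = items.shape
    corr = items.corr().to_numpy()
    statistic = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(corr))
    dof = p * (p - 1) / 2
    return statistic, chi2.sf(statistic, dof)

# Hypothetical usage: drop items whose corrected item-total correlation is below 0.3
# (as was done with AF5), then check sphericity before running EFA.
# responses = pd.read_csv("speaq_responses.csv")
# print(cronbach_alpha(responses[["AF1", "AF2", "AF3", "AF4"]]))
# print(corrected_item_total(responses[["AF1", "AF2", "AF3", "AF4"]]))
# print(bartlett_sphericity(responses))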
3.4. Hypothesis testing
In the next stage of data analysis, model fitness and the validity of the hypotheses were tested through
multiple regression analysis. Several criteria were examined before applying the
standardized regression equation. For the IT-intensive group, the Durbin-Watson statistic of 2.157, which is
close to 2, indicates that the regression model does not suffer from first-order autocorrelation [21]. Next, the
p-value of the F-test at 0.000<0.05 indicates that the independent variables in the model have a linear
relationship with the dependent variable. Lastly, the VIF values of 1.024 to 2.418 are all lower than 3,
indicating that the regression model is not affected by multicollinearity [21].
For the non-IT-intensive group, the Durbin-Watson statistic was recorded at 2.157, the p-value
was 0.000, and the VIF values fluctuate slightly between 1.164 and 1.615. Overall, these coefficients
also met the requirements of multivariate regression analysis. Through multiple regression analysis, the
extent to which the six proposed factors impact students’ attitudes in the IT-majored group and the non-IT group is
presented in Table 2 and Table 3, respectively; an illustrative sketch of these diagnostics is given after Table 3.
Table 2. Coefficients of IT-intensive group
Model            Unstd. B   Std. error   Std. Beta   t       Sig.    Tolerance   VIF
1 (Constant) 0.324 0.446 0.725 0.469
AF 0.247 0.068 0.252 3.615 0.000 0.599 1.670
VA 0.433 0.102 0.358 4.260 0.000 0.414 2.418
PR 0.092 0.078 0.081 1.173 0.243 0.616 1.623
RE 0.010 0.063 0.010 0.154 0.878 0.633 1.579
SE 0.331 0.093 0.269 3.550 0.001 0.510 1.960
TL 0.056 0.053 0.057 -1.046 0.297 0.977 1.024
Table 3. Coefficients of non-IT-intensive group
Model            Unstd. B   Std. error   Std. Beta   t       Sig.    Tolerance   VIF
1 (Constant) 1.084 0.649 1.669 0.097
AF 0.509 0.080 0.486 6.373 0.000 0.619 1.615
VA 0.139 0.096 0.109 1.452 0.149 0.640 1.564
PR 0.213 0.119 0.122 1.794 0.075 0.779 1.283
RE 0.071 0.080 0.062 0.890 0.375 0.748 1.338
SE 0.224 0.083 0.186 2.709 0.008 0.762 1.312
TL 0.018 0.082 0.014 0.221 0.825 0.859 1.164
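The diagnostics reported for both groups (F-test, Durbin-Watson, and VIF) can be obtained from an ordinary least squares fit. The Python sketch below assumes a data frame with the six construct scores and the attitude score for one group; the column names are our assumptions, and the snippet only mirrors the checks described above, not the authors’ SPSS workflow.

import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor
from statsmodels.stats.stattools import durbin_watson

def attitude_regression(df):
    """OLS of attitude (ATT) on the six SPEAQ constructs with the diagnostics
    used in the paper: F-test p-value, Durbin-Watson statistic, and VIF values."""
    predictors = ["AF", "VA", "PR", "RE", "SE", "TL"]
    X = sm.add_constant(df[predictors])
    model = sm.OLS(df["ATT"], X).fit()
    vif = {name: variance_inflation_factor(X.values, i)
           for i, name in enumerate(X.columns) if name != "const"}
    return {
        "coefficients": model.params,
        "p_values": model.pvalues,
        "f_pvalue": model.f_pvalue,                   # should be below 0.05
        "durbin_watson": durbin_watson(model.resid),  # should be close to 2
        "vif": vif,                                   # should stay below 3
    }

# Hypothetical usage, one call per group:
# results_it = attitude_regression(df_it_group)
# results_non_it = attitude_regression(df_non_it_group)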
4. RESULTS AND DISCUSSION
Utilizing a scenario-based survey with 312 student volunteers and multivariate regression analysis,
all hypotheses and 28 items were found to work well except for AF5, which was intended as a
trap question. Overall, the proposed SPEAQ components positively affect students’
attitudes. This outcome is in line with the results of [17], [25], [26]. Based on the standardized
coefficient beta (β), the effects of each identified determinant on the students of the two research groups are
interpreted in the following subsections.
4.1. Affective factors
Affective factors were found to significantly affect both research groups. Thus, we confirmed
the finding of [27], [28], which elucidated that the perception of hedonic attributes is an excellent catalyst for
academic efforts, attention, engaged emotions, and engaged behaviors of learners. Researching the context of
Vietnamese secondary schools and high schools, the result of Tang, Nguyen, and Tran [29] further concluded
that affective factors not only directly affected learners’ attitudes, but also indirectly impacted their plans,
intentions and commitments to a planned behavior in the future. Through our empirical finding, we confirm
that this explanation also applies to the higher educational context of Vietnam.
4.2. Validity
Validity was found to significantly affect the IT group but only moderately affect the other group. As our work is
the first to follow this line of comparison, we could not find any previous support and hence had to rely on our
experience of university students’ attitudes. It is self-explanatory that IT students would consider
e-assessments more valid for their subject area, as all their tests throughout the course are computer-based;
this is also why they have higher expectations of digital literacy in the collegiate environment.
Meanwhile, non-IT students usually have fewer interactions with computers during their study and thus may not
highly appreciate the validity of e-assessments.
4.3. Practicality
In contrast to validity, practicality is a modest determinant of attitudes in the IT group,
but a profound one in the non-IT group. Previous evidence directly supporting this outcome is also unavailable, as
earlier studies did not follow this comparison method. However, investigating students in general, the studies of
[30], [31] both affirmed that the practicality of e-assessment was a significant element in students’ perspective.
With less confidence in their own digital skills, non-IT students tend to be more concerned about
instant support from responsible staff to fix technical problems and about more chances to interact with computers.
This might explain why practicality turned out to be more critical for the attitudes of non-IT
students than for their counterparts in this study.
4.4. Reliability
Reliability was reported to play an insignificant positive role in the attitudes of both groups. This
outcome is opposite to that of [15], [16]. Based on our insights into Vietnamese students, we suppose there
were three main reasons behind our findings. Firstly, since students mostly took asynchronous tests
during the research period, they could actively take control of the quality of their own computers. Moreover, students
were always given sufficient time to check the results and discuss any queries related to
the tests with lecturers. Lastly, as most of the online tests contained a large number of short-answer questions, it is hard for
students either to remember the test contents or to compare the difficulty of their own tests with that of their peers. In
the short run, more in-depth interviews at other universities are encouraged to examine these
explanations.
4.5. Security
Security was determined to have a profound effect on the attitudes of students in both groups. This
outcome is in line with the findings of [16], [27]. In fact, previous studies have confirmed that when confronted
with security notifications, especially when asked to react to them, most users tended to reject the online
learning system. Security in online learning is closely connected to authentication, encryption, access control,
and managing users and their permissions to ensure secure use and access [32]. Through our empirical finding,
we also confirm that this explanation applies to the higher education context of Vietnam.
4.6. Teaching and learning
Although the teaching-learning dimension played the least important role in the present study, its
effect was rated as highly significant in previous studies [15], [16]. This contrast might be explained by the fact that teaching-
learning activities in Vietnam have long been associated with paper-based forms. Therefore, the abrupt shift
to e-assessments due to COVID-19 may not have provided students sufficient time to experience its benefits on
learning outcomes.
Based on the independent-samples t-test results, our study concluded that students at the university have
positive attitudes towards e-exams and that discipline does not make a dramatic difference in learners’ mindset.
This differs from [25], in which clusters such as education and software engineering were more in favor of e-exams
while others such as chemistry, mathematics, and biology were more reluctant. Our result might be explained by the
fact that students in this private university have been well equipped with educational technology in their daily study,
so most of them, regardless of major, feel comfortable and familiar with e-exams.
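The group comparison behind H7 corresponds to an independent-samples t-test on the attitude scores of the two groups. The Python sketch below is our illustration of such a test, including Levene’s check for equal variances as in the usual SPSS output; the data frames and column names are hypothetical.

from scipy import stats

def compare_attitudes(att_it, att_non_it, alpha=0.05):
    """Independent-samples t-test on mean attitude scores of the two major groups."""
    _, levene_p = stats.levene(att_it, att_non_it)  # equal-variance check
    t_stat, p_value = stats.ttest_ind(att_it, att_non_it, equal_var=levene_p > alpha)
    verdict = "no significant difference" if p_value > alpha else "significant difference"
    return t_stat, p_value, verdict

# Hypothetical usage:
# t, p, verdict = compare_attitudes(df_it_group["ATT"], df_non_it_group["ATT"])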
5. CONCLUSION
With its empirical findings, this study has enriched the research literature on tech-based
assessment from the view of test-takers, which is still scarce in Vietnam. Through the
results of the focus group discussion, pilot tests, and formal survey distribution, this study puts forward helpful
suggestions to further refine the student perception of e-assessment questionnaire (SPEAQ), enhancing its applicability
and sustainability for widespread implementation. Moreover, an outcome variable of attitude was added
not only to identify students’ attitudes by calculating means, but also to highlight the most influential factors
affecting attitudes via regression analysis, providing more solid evidence for future policy construction.
As for managerial contributions, since our findings show that major does not cause great differences in
students’ attitudes, the same guideline, with a strong emphasis on security and affective factors, could be
applied to all departments of the school. We recommend that Nguyen Tat Thanh University in particular, and higher
education institutions in general, construct technology-embedded academic landscapes that maximize students’ digital
familiarity and proficiency. Furthermore, higher education institutions need effective
maintenance policies so that students are less likely to face technical problems and discouragement. Regarding
security, it is reasonable for students to strongly demand that their account information, and especially their scores,
be kept secure, as schools’ systems are often hacked to tamper with scores or to
sell personal information. Educational institutions should invest in upgrading their information
security management systems so that any errors or attacks on program code can be flagged in a timely manner. Regarding
cheating, besides common methods such as assigning different tests to different students or
setting time limits, the use of proctored tests is highlighted. After requiring students to log in
with their school-provided accounts, proctored tests confirm students’ identities via webcams,
monitoring and flagging any signs of suspicious activity. All suggestions above are thought
to be significant for further enhancing the school’s capability to deliver e-assessment.
While providing some valuable findings, this study encounters some limitations. Firstly, the
sample size of the non-IT group remains limited compared to the 20,000 students at Nguyen Tat
Thanh University. It is therefore suggested that future research replicate our model (with or without
adaptation) to further investigate each department and reach more generalizable outcomes.
Furthermore, longitudinal studies, together with intensive qualitative designs, might provide more useful
insights into students’ attitudes and the reasons behind their decisions.
ACKNOWLEDGEMENTS
The author thanks the Potential Academic Staff (PAS) Grant, provided by the Research Management
Centre, UTM (PY/2017/00149), for supporting and funding this research.
REFERENCES
[1] T. Việt, “High quality education for all should be Vietnam’s priority,” The World Bank, Apr. 2012. [Online]. Available:
https://www.worldbank.org/en/news/feature/2012/04/25/high-quality-education-for-all-should-be-vietnams-priority (accessed Jan.
21, 2022).
[2] A. Kiet, “Vietnam spends 5.8% of GDP on education,” Hanoi Times, Jan. 2019. [Online]. Available:
https://hanoitimes.vn/vietnam-spends-58-of-gdp-on-education-1781.html (accessed Jan. 21, 2022).
[3] T. Huynh, “Concerns about computer-based national high school exams,” (in Vietnamese), Tuoitre News, 2019. [Online].
Available: https://tuoitre.vn/nhieu-ban-khoan-khi-thi-thpt-quoc-gia-tren-may-tinh-20190926200012856 (accessed Jan. 21, 2022).
[4] R. Fuller, V. Joynes, J. Cooper, K. Boursicot, and T. Roberts, “Could COVID-19 be our ‘There is no alternative’ (TINA)
opportunity to enhance assessment?” Medical Teacher, vol. 42, no. 7, pp. 781–786, Jul. 2020, doi:
10.1080/0142159X.2020.1779206.
[5] S. I. Malik and J. Coldwell-Neilson, “Gender differences in an introductory programming course: New teaching approach,
students’ learning outcomes, and perceptions,” Education and Information Technologies, vol. 23, no. 6, pp. 2453–2475, Nov.
2018, doi: 10.1007/s10639-018-9725-3.
[6] A. Zabadi, “Investigate students attitudes toward computer based Test (CBT) at chemistry course,” Archives of Business
Research, vol. 4, no. 6, Dec. 2016, doi: 10.14738/abr.46.2325.
[7] B. Z. Waziri, F. A. Tahir, and M. B. Sani “Students’ perception of computer based approach to examining undergraduate
accounting courses in the University of Maiduguri, Nigeria,” European Journal of Business and Management, vol. 11, no. 8, Mar.
2019, doi: 10.7176/ejbm/11-8-03.
[8] E. Pools and C. Monseur, “Student test-taking effort in low-stakes assessments: evidence from the English version of the PISA
2015 science test,” Large-Scale Assessments in Education, vol. 9, no. 1, p. 10, Dec. 2021, doi: 10.1186/s40536-021-00104-6.
[9] M. Kehal, “Assurance of learning and accreditations in business schools: an AACSB perspective,” Journal of Economic and
Administrative Sciences, vol. 36, no. 1, pp. 82–96, Oct. 2019, doi: 10.1108/jeas-06-2018-0066.
[10] N. Utaberta and B. Hassanpour, “Aligning assessment with learning outcomes,” Procedia - Social and Behavioral Sciences,
vol. 60, pp. 228–235, Oct. 2012, doi: 10.1016/j.sbspro.2012.09.372.
[11] R. Bashitialshaaer, M. Alhendawi, and Z. Lassoued, “Obstacle comparisons to achieving distance learning and applying electronic
exams during COVID-19 pandemic,” Symmetry, vol. 13, no. 1, pp. 1–16, Jan. 2021, doi: 10.3390/sym13010099.
[12] F. D. Davis, “Perceived usefulness, perceived ease of use, and user acceptance of information technology,” MIS Quarterly:
Management Information Systems, vol. 13, no. 3, pp. 319–339, Sep. 1989, doi: 10.2307/249008.
[13] H. Li and J. Yu, “Learners’ continuance participation intention of collaborative group project in virtual learning environment: an
extended TAM perspective,” Journal of Data, Information and Management, vol. 2, no. 1, pp. 39–53, Mar. 2020, doi:
10.1007/s42488-019-00017-8.
[14] M. Bahar and M. Asil, “Attitude towards e-assessment: influence of gender, computer usage and level of education,” Open
Learning, vol. 33, no. 3, pp. 221–237, Sep. 2018, doi: 10.1080/02680513.2018.1503529.
[15] J. Dermo, “e-assessment and the student learning experience: A survey of student perceptions of e-assessment,” British Journal of
Educational Technology, vol. 40, no. 2, pp. 203–214, Mar. 2009, doi: 10.1111/j.1467-8535.2008.00915.x.
[16] K. Rajasekharan Pillai and A. V. Prakash, “Technological leverage in higher education: An evolving pedagogy,” Journal of
International Education in Business, vol. 10, no. 2, pp. 130–146, Nov. 2017, doi: 10.1108/JIEB-09-2016-0034.
[17] H. K. Mahmood, F. Hussain, M. Mahmood, R. Kumail, and J. Abbas, “Impact of E-assessment at middle school student’
learning-an empirical study at USA middle school students,” International Journal of Scientific & Engineering Research, vol. 11,
no. 4, pp. 1722–1736, 2020.
[18] M. Liu and L. Wronski, “Trap questions in online surveys: Results from three web survey experiments,” International Journal of
Market Research, vol. 60, no. 1, pp. 32–49, Jan. 2018, doi: 10.1177/1470785317744856.
[19] J. F. Hair, R. E. Anderson, R. L. Tatham, and W. C. Black, Multivariate Data Analysis. Upper Saddle River, N.J: Prentice Hall,
1998.
[20] S. B. Green, “How many subjects does it take to do a regression analysis?” Multivariate Behavioral Research, vol. 26, no. 3,
pp. 499–510, Jul. 1991, doi: 10.1207/s15327906mbr2603_7.
[21] J. F. Hair, G.T.M. Hult, C. Ringle, M. Sarstedt, A primer on partial least squares structural equation modeling (PLS-SEM),
2nd ed. Los Angeles: Sage, 2017.
[22] N. Ul Hadia, N. Abdullah, and I. Sentosa, “An easy approach to exploratory factor analysis: marketing perspective,” Journal of
Educational and Social Research, vol. 6, no. 1, p. 215, Jan. 2016, doi: 10.5901/jesr.2016.v6n1p215.
[23] G. Rasch, Probabilistic models for some intelligence and attainment tests. Chicago: MESA Press, 1993.
[24] J. F. Hair, Jr., R. E. Anderson, R. L. Tatham, and W. C. Black, Multivariate data analysis with readings, 4th ed. Englewood
Cliffs, N.J: Prentice Hall, 1995.
[25] M. Hillier, “The very idea of e-Exams: student (pre) conceptions,” in Proceedings of ASCILITE 2014-Annual Conference
Australian Society for Computers in Tertiary Education, 2014, pp. 77-88.
[26] K. Awad, “Attitude of Ash-Shobak University College Students to E-Exam for Intermediate University Degree in Jordan,”
Journal of Education and Practice, vol. 7, no. 9, pp. 10–17, 2016.
[27] V. J. Shute, J. P. Leighton, E. E. Jang, and M. W. Chu, “Advances in the science of assessment,” Educational Assessment, vol. 21,
no. 1, pp. 34–59, Jan. 2016, doi: 10.1080/10627197.2015.1127752.
[28] D. Abun, T. Magallanes, and M. J. Incarnacion, “College students’ cognitive and affective attitude toward higher education and
their academic engagement,” International Journal of English Literature and Social Sciences, vol. 4, no. 5, pp. 1494–1507, 2019,
doi: 10.22161/ijels.45.38.
[29] T. T. Tang, T. N. Nguyen, and H. T. T. Tran, “Vietnamese teachers’ acceptance to use E-assessment tools in teaching: An
empirical study using PLS-SEM,” Contemporary Educational Technology, vol. 14, no. 3, p. ep375, May 2022, doi:
10.30935/cedtech/12106.
[30] S. S. M. Huda, M. Kabir, and T. Siddiq, “E-assessment in higher education: Students’ perspective,” International Journal of
Education and Development using Information and Communication Technology, vol. 16, no. 2, pp. 250–258, 2020.
[31] M. Appiah and F. V. Tonder, “E-assessment in higher education: A review,” International Journal of Business Management &
Economic Research, vol. 9, no. 6, pp. 1454–1460, 2018.
[32] K. Thamadharan and N. Maarop, “The acceptance of E-assessment considering security perspective: Work in progress,”
International Journal of Social, Behavioral, Economics, Business and Industrial Engineering, vol. 9, no. 3, pp. 874–879, 2015,
doi: 10.5281/zenodo.1099873.
BIOGRAPHIES OF AUTHORS
Bao Linh Tran is a researcher at the Institute of Interdisciplinary Social
Sciences, Nguyen Tat Thanh University, Ho Chi Minh City, Vietnam. Her research
interests lie in tech-based learning, educational assessment, innovations in education,
and student satisfaction. She can be contacted at email: tranbaolinh0108@gmail.com.
Lan Phuong Nguyen holds a PhD in Education Administration and is Vice-rector of
Nguyen Tat Thanh University, Ho Chi Minh City, Vietnam. She is one of the founding
members of Nguyen Tat Thanh University and also participates in interdisciplinary
research spanning economics, management, technology, and the development of
scientific research skills. She can be contacted at email: nlphuong@ntt.edu.vn.

More Related Content

PDF
Presentation of ola erstad's paper of changing assessment practices and the ...
PPTX
E assessment
PDF
Pres nlc 2013 lit
PDF
An exploration of the cisco online courses a basis for the development of a l...
PDF
Pioneering Online Assessment Solutions: Empirical Experiences from Educationa...
PDF
PIONEERING ONLINE ASSESSMENT SOLUTIONS: EMPIRICAL EXPERIENCES FROM EDUCATIONA...
PDF
Pioneering Online Assessment Solutions: Empirical Experiences from Educationa...
PDF
Pioneering Online Assessment Solutions: Empirical Experiences from Educationa...
Presentation of ola erstad's paper of changing assessment practices and the ...
E assessment
Pres nlc 2013 lit
An exploration of the cisco online courses a basis for the development of a l...
Pioneering Online Assessment Solutions: Empirical Experiences from Educationa...
PIONEERING ONLINE ASSESSMENT SOLUTIONS: EMPIRICAL EXPERIENCES FROM EDUCATIONA...
Pioneering Online Assessment Solutions: Empirical Experiences from Educationa...
Pioneering Online Assessment Solutions: Empirical Experiences from Educationa...

Similar to Evaluating factors affecting attitudes of IT-intensive and non-IT-intensive students towards e-assessment (20)

PDF
ICT in Assessment: A Backbone for Teaching and Learning Process
PPTX
Technology & Assessment Oct 2016
PDF
Call for Papers Journal of Educational Technology & Society
PDF
Contribution300 a
PDF
'Stepping into the unknown' - Assessment practices in a digital age
PPTX
Engaging learners in computer-based summative exams: Reflections on a partici...
PDF
ozkan2009.pdf
PPT
Kingsley Osamede Omorogiuwa:Student Learning Assessment in Open and Distance ...
PPTX
online assessment during covid19 .pptx
PPTX
Technology-Enhanced-Assessment-for-Learning.pptx
PPT
He547 unit 7 tech intergration
DOCX
Power point journal
PPTX
E assessment- developing new dialogues for the digital age
PDF
Affective e-learning approaches, technology and implementation model: a syste...
PDF
Transforming assessment for learning in a digital age
PDF
THE USE OF COMPUTER-BASED LEARNING ASSESSMENT FOR PROFESSIONAL COURSES: A STR...
PDF
Students' Intention to Use Technology and E-learning probably Influenced by s...
PDF
eAssessment in practice
PPTX
A theoretical framework for e assessment in higher education
PDF
The quasimoderating effect of perceived affective quality on an extending Tec...
ICT in Assessment: A Backbone for Teaching and Learning Process
Technology & Assessment Oct 2016
Call for Papers Journal of Educational Technology & Society
Contribution300 a
'Stepping into the unknown' - Assessment practices in a digital age
Engaging learners in computer-based summative exams: Reflections on a partici...
ozkan2009.pdf
Kingsley Osamede Omorogiuwa:Student Learning Assessment in Open and Distance ...
online assessment during covid19 .pptx
Technology-Enhanced-Assessment-for-Learning.pptx
He547 unit 7 tech intergration
Power point journal
E assessment- developing new dialogues for the digital age
Affective e-learning approaches, technology and implementation model: a syste...
Transforming assessment for learning in a digital age
THE USE OF COMPUTER-BASED LEARNING ASSESSMENT FOR PROFESSIONAL COURSES: A STR...
Students' Intention to Use Technology and E-learning probably Influenced by s...
eAssessment in practice
A theoretical framework for e assessment in higher education
The quasimoderating effect of perceived affective quality on an extending Tec...
Ad

More from International Journal of Evaluation and Research in Education (IJERE) (20)

PDF
Technology-based learning interventions on mathematical problem-solving: a me...
PDF
An exploratory study on perceived online learning experience of university st...
PDF
The role of social support on vocational school students’ career choices
PDF
Systematic literature review on the implementation of the Six Sigma approach ...
PDF
Validation of the French version of the classroom assessment scoring system i...
PDF
Visualization of students’ cognitive knowledge in digital concept mapping
PDF
Exploring elementary teacher education students’ perception on parental invol...
PDF
Eco-pesantren modeling for environmentally friendly behavior: new lessons fro...
PDF
Technological, pedagogical, and content knowledge for technology integration:...
PDF
Sustainable leadership practices among school leaders and their relationship ...
PDF
The psychometric properties of students’ attitudes, coping strategies, and ps...
PDF
Sustainable entrepreneurial culture in promoting innovation: a higher educati...
PDF
Structural equation modelling: validation of career readiness model using psy...
PDF
How to conduct paired-t-test SPSS: comprehension in adsorption with bibliometric
PDF
Portrait of students’ language politeness in elementary school
PDF
Motivation mediating effect on principals’ personality, job satisfaction, and...
PDF
Professional and personal traits of the teacher and the relationship with did...
PDF
A scoping review on mapping the digital leadership constructs for educational...
PDF
Return on investment from educational research grant funding: deliverables an...
PDF
Employability of accountancy graduates of a Philippine public university
Technology-based learning interventions on mathematical problem-solving: a me...
An exploratory study on perceived online learning experience of university st...
The role of social support on vocational school students’ career choices
Systematic literature review on the implementation of the Six Sigma approach ...
Validation of the French version of the classroom assessment scoring system i...
Visualization of students’ cognitive knowledge in digital concept mapping
Exploring elementary teacher education students’ perception on parental invol...
Eco-pesantren modeling for environmentally friendly behavior: new lessons fro...
Technological, pedagogical, and content knowledge for technology integration:...
Sustainable leadership practices among school leaders and their relationship ...
The psychometric properties of students’ attitudes, coping strategies, and ps...
Sustainable entrepreneurial culture in promoting innovation: a higher educati...
Structural equation modelling: validation of career readiness model using psy...
How to conduct paired-t-test SPSS: comprehension in adsorption with bibliometric
Portrait of students’ language politeness in elementary school
Motivation mediating effect on principals’ personality, job satisfaction, and...
Professional and personal traits of the teacher and the relationship with did...
A scoping review on mapping the digital leadership constructs for educational...
Return on investment from educational research grant funding: deliverables an...
Employability of accountancy graduates of a Philippine public university
Ad

Recently uploaded (20)

PDF
LIFE & LIVING TRILOGY- PART (1) WHO ARE WE.pdf
PPTX
ELIAS-SEZIURE AND EPilepsy semmioan session.pptx
PDF
Skin Care and Cosmetic Ingredients Dictionary ( PDFDrive ).pdf
PDF
International_Financial_Reporting_Standa.pdf
PDF
Literature_Review_methods_ BRACU_MKT426 course material
PDF
Τίμαιος είναι φιλοσοφικός διάλογος του Πλάτωνα
PPTX
Introduction to pro and eukaryotes and differences.pptx
PDF
semiconductor packaging in vlsi design fab
PDF
CISA (Certified Information Systems Auditor) Domain-Wise Summary.pdf
PDF
Environmental Education MCQ BD2EE - Share Source.pdf
PPTX
What’s under the hood: Parsing standardized learning content for AI
PDF
FOISHS ANNUAL IMPLEMENTATION PLAN 2025.pdf
PDF
AI-driven educational solutions for real-life interventions in the Philippine...
PDF
BP 704 T. NOVEL DRUG DELIVERY SYSTEMS (UNIT 1)
PDF
Empowerment Technology for Senior High School Guide
PPTX
Module on health assessment of CHN. pptx
PDF
HVAC Specification 2024 according to central public works department
DOCX
Cambridge-Practice-Tests-for-IELTS-12.docx
PDF
English Textual Question & Ans (12th Class).pdf
PPTX
Computer Architecture Input Output Memory.pptx
LIFE & LIVING TRILOGY- PART (1) WHO ARE WE.pdf
ELIAS-SEZIURE AND EPilepsy semmioan session.pptx
Skin Care and Cosmetic Ingredients Dictionary ( PDFDrive ).pdf
International_Financial_Reporting_Standa.pdf
Literature_Review_methods_ BRACU_MKT426 course material
Τίμαιος είναι φιλοσοφικός διάλογος του Πλάτωνα
Introduction to pro and eukaryotes and differences.pptx
semiconductor packaging in vlsi design fab
CISA (Certified Information Systems Auditor) Domain-Wise Summary.pdf
Environmental Education MCQ BD2EE - Share Source.pdf
What’s under the hood: Parsing standardized learning content for AI
FOISHS ANNUAL IMPLEMENTATION PLAN 2025.pdf
AI-driven educational solutions for real-life interventions in the Philippine...
BP 704 T. NOVEL DRUG DELIVERY SYSTEMS (UNIT 1)
Empowerment Technology for Senior High School Guide
Module on health assessment of CHN. pptx
HVAC Specification 2024 according to central public works department
Cambridge-Practice-Tests-for-IELTS-12.docx
English Textual Question & Ans (12th Class).pdf
Computer Architecture Input Output Memory.pptx

Evaluating factors affecting attitudes of IT-intensive and non-IT-intensive students towards e-assessment

  • 1. International Journal of Evaluation and Research in Education (IJERE) Vol. 12, No. 2, June 2023, pp. 790~797 ISSN: 2252-8822, DOI: 10.11591/ijere.v12i2.22886  790 Journal homepage: http://ijere.iaescore.com Evaluating factors affecting attitudes of IT-intensive and non-IT-intensive students towards e-assessment Bao Linh Tran, Lan Phuong Nguyen Department of Applied Economics, Institute of Interdisciplinary Social Sciences, Nguyen Tat Thanh University, Ho Chi Minh City, Vietnam Article Info ABSTRACT Article history: Received Sep 10, 2021 Revised Nov 30, 2022 Accepted Jan 3, 2023 Despite the significant shift to distance computer-based test as an inevitable outcome of Industry 4.0 and the public lockdown of COVID-19, little effort has been made to research this new testing mode. To address this issue, this study targets two groups of information technology (IT)-intensive and non- IT-intensive students with an aim of investigating factors that effectively encourage each group to adopt online assessment and whether their majors cause any differences in the students’ attitudes. Based on the student perception of e-assessment questionnaire (SPEAQ) with some slight modifications, a final 28-item survey was formed and distributed to 400 students. Results have shown that the factors of security, and affective factors were the top factors to impact both groups of students, while the impact of validity and practicality varied among the two groups and reliability and teaching-learning were at the bottom. Besides, there were no noticeable differences in the attitudes of students coming from different majors. Keywords: COVID-19 Digital transformation E-assessment Students’ attitudes Vietnam This is an open access article under the CC BY-SA license. Corresponding Author: Bao Linh Tran Department of Applied Economics, Institute of Interdisciplinary Social Sciences, Nguyen Tat Thanh University Nguyen Tat Thành, Ho Chi Minh City 70000, Vietnam Email: tranbaolinh0108@gmail.com 1. INTRODUCTION Enhancing the quality of teaching and learning has always been a national priority in Vietnam [1]. Over the years, Vietnamese governments have invested more than 20% of total budget expenditures in educational innovation projects, among which technology application is of great pre-eminence [2]. Currently, with an aim of responding to both COVID-19 and the current digital transformation, computer-based assessments have been applied to academic institutions in Vietnam more intensively than ever. However, this strategy still receives many opposing views as some institutions claim that computer-based assessment is still a newly unguided concept that might cost a huge sum of the national budget for implementation [3]. Moreover, numerous issues of this new assessment mode such as cheating, technical problems, and students’ unwillingness to adopt still remained unsolved [4]. Regarding research gaps, whilst many studies have been done on the part of instructors, e-learning experts and educational technologists, little has been known regarding students’ perception [5], especially in the computer-based test field [6], [7]. It is strongly posited that educationalists needed to study the test-takers’ attitudes towards e-assessment. If students failed to have reliability in the test, their unwilling participation might seriously affect the learning outcomes [8]. 
Acknowledging the importance, the research decided to carry out a study utilizing the student perception of e-assessment questionnaire (SPEAQ) [9] to contribute more reliable findings towards this yet to
  • 2. Int J Eval & Res Educ ISSN: 2252-8822  Evaluating factors affecting attitudes of IT-intensive and non-IT-intensive students … (Bao Linh Tran) 791 be rich research stream. Specifically, this study aims to propose a new measurement scale based on SPEAQ and the outcome variable of attitude; to identify which factors of e-exams mostly affect students’ perception; to explore any differences in the mindset of students majoring in information technology (IT) and their counterparts; and to propose several approaches for future innovative plans based on the statistical findings of this research. To fulfil these objectives, a qualitative study from experts and a quantitative study from students at IT faculty and others at Nguyen Tat Thanh University is used. Nguyen Tat Thanh University is a private university in the South of Vietnam which is well-known for its dedicated effort in digital educational innovations. As stated on the school’s website, the school has their own e-learning institute whose objectives are to open learning opportunities for students at anytime and anywhere, provide up-to-date materials with quick access, and utilize technology to enhance students’ learning and researching autonomy. During COVID-19, the e-learning institute tried hard to avoid interruptions in students’ learning and assessment process by providing the learning form of e-learning. Regarding time, the research is carried out from March 2021 when both on-campus and off-campus students have at least one semester to experience their online training programs. Therefore, the school is an ideal place for research implementation. 2. LITERATURE REVIEW 2.1. Online assessment Together with various types of computer-based teaching modules such as e-learning and blended learning, the integration of technology into assessment procedures seems inevitable with a typical example of electronic assessment. E-assessment is defined as an act of using technological tools to aid the school in assessing students’ performance [9]. Meanwhile, e-exams are also seen as a type of assessments which store, deliver, record, and record student’s marks and feedback. This process is supported by various technological gadgets such as laptops, desktop computers, smartphones, iPads, and tablets. Different delivering formats of Word documents, pdf, videos, photos, and simulations can also be used instead of paper [10]. From the functional aspect, e-assessment can be categorized into different types depending on the test’s purposes. Considering their functions, e-assessments could be under the forms of diagnostic, formative and summative tests. And they are the power of instant marking and feedback that help e-assessments become far superior to their counterpart. Based on the time and place to take a test, e-assessment can be divided into synchronous and asynchronous assessments. Synchronous tests are often used for high-stakes exams, which require everything to run smoothly. Thus, students are asked to sit exams at the same time in one or more on-site PC labs. Meanwhile, asynchronous e-exams are often used for low-stakes, resource-intensive and un-invigilated tests. Thus, these tests can be done at any time with/without a specific time allotment [11]. 2.2. Student attitudes to use e-assessment The attitude to adopt technology has widely been studied due to its dramatic influence on customer satisfaction, and behavior intention, which later generates the actual use of digital products or services [12], [13]. 
In the field of e-assessment, the attitudes and behaviors of test-takers should be critically and thoroughly investigated to guarantee the test’s face validity, students’ engagement and cooperation levels [14]. Perceiving the same viewpoint, SPEAQ was introduced by Dermo in order to identify the factors that have caused the most significant effect on the perception of students at Bradford University, England [15]. However, due to the lack of the outcome variable, students’ attitudes were only measured by the mean values of each item on the Likert 5-point scale to see whether students had positive (mean>3.25) or negative responses (mean<2.75). Succeeding researcher came up with the idea of combining SPEAQ with the outcome variable of ‘willingness to adopt’, which successfully enabled her to further identify factors significantly/insignificantly affecting students’ attitudes of example in a private university of India [16]. In the context of Vietnam, e-assessment has just started to gain their popularity recently during COVID-19. Thus, no research has been found on attitudes of Vietnamese students towards this trend. For these reasons, the present study proposed a new version of scale based on SPEAQ and the outcome variable of attitude. When it comes to the topic of student attitudes’ antecedents, six factors including affective factors, validity, practicality, reliability, security and pedagogy were suggested to be reliable measurements [15]. Among them, the affective factors refer to candidates’ feelings during the exam time, with the main concentration on the sense of ease and comfort. Validity is related to the appropriateness of exam delivery modes to specific curriculum design and university studies. Practicality concerns the advantages and barriers of e-assessment as well as its practical applications. Reliability is defined as users' trust in the accuracy and fairness of e-assessment compared to paper-based assessment. Security can be interpreted as users’ assertiveness about the ability of the devices to protect them from e-cybercrime. Finally, pedagogy is to do with the effectiveness of e-assessment on the teaching-learning process.
As mentioned earlier, most research projects related to e-assessment used the mean values of the six proposed antecedents in the SPEAQ to measure students' attitudes. In a previous study [15], students responded positively to all 17 items of the six constructs, with security and pedagogy being the two most satisfactory factors; recorded at a mean of 3.051, reliability and fairness failed to achieve positive responses. In another study, the six dimensions of the main scale were merged into only two dimensions, "comfort in use" and "learning through e-tests", with the latter obtaining a higher degree of positivity [17]. Rather than using mean values, Pillai and Prakash relied on T-statistics to identify the level of influence of each construct on learners' willingness to adopt computerized tests [16]; apart from the insignificant influence of reliability, they posited that the other five constructs significantly impacted willingness to adopt. These previous findings have, therefore, led us to the following hypotheses: i) Affective factors positively affect students' attitudes towards e-assessment (H1); ii) Validity positively affects students' attitudes towards e-assessment (H2); iii) Practicality positively affects students' attitudes towards e-assessment (H3); iv) Reliability positively affects students' attitudes towards e-assessment (H4); v) Security positively affects students' attitudes towards e-assessment (H5); vi) Teaching and learning positively affect students' attitudes towards e-assessment (H6); and vii) The IT major causes no difference in students' attitudes towards e-assessment (H7). The proposed model illustrating the above hypotheses is presented in Figure 1.

Figure 1. The final hypothesized research model

3. RESEARCH METHOD
Using the SPEAQ as a reference, a measurement scale had to be developed for the specific context of Vietnam. To do so, a qualitative focus group discussion was implemented with five experts holding directing positions: the Director of the E-learning Department, the Deputy Director of the E-learning Department, the Dean of the Information Technology Faculty, the Deputy Dean of the Information Technology Faculty, and the Director of the Institute of Interdisciplinary Social Sciences at the University.

3.1. Qualitative: Focus group discussion
Initially, the discussion panel decided to rephrase all negatively worded statements into positive ones. For instance, the statement "I find it hard to concentrate on the questions when doing an online exam" was changed into "I can focus on tests better when doing online assessments" (Table 1). This adjustment produces a consistent mindset in students' answers, with the fifth answer option representing a completely positive attitude and the first a completely negative one. However, in order to create a trap question in the survey, only statement AF5 was left unchanged; its meaning is negative and opposite to that of AF1. This single reverse-coded question is believed to keep respondents alert without causing much trouble for data interpretation [18]. Next, the experts looked closely at the content of each item to make sure it fits the culture of Vietnamese universities. For the reliability construct, the item coded I4 was removed because its idea is too general and therefore unclear. For the validity construct, the item coded I5 in the original SPEAQ was removed as outdated: e-exam questions are no longer confined to multiple-choice forms but come in various types such as gap-filling, information matching, or even paragraph writing [11]; items I2 and I4 were reworded into VA2 and VA4 to become clearer. For the practicality construct, the item coded I3 in the original SPEAQ was removed on the grounds that e-exams are too short to cause any serious health problems, while I5 was reworded into PR4 to become more transparent. Regarding the security construct, only the first observed variable (I1) was dismissed, since it is nearly impossible to conclude from a quick evaluation whether online assessment is as secure as paper-based testing, so the item carried a high risk of receiving inconsiderate answers; all other items were kept. As for the teaching and learning construct, I2, I3, and I4 were dismissed because their meanings are too general and unclear, while two new benefits of e-assessment to the teaching-learning process were added with the agreement of all panel members. Finally, the panel made no refinements to the items of the affective factors and attitude constructs, as they are clearly worded. Overall, the refined items are listed in Table 1.

Table 1. Indicators retained in the final model
Affective factors
  AF1  I feel comfortable when doing tests on e-learning
  AF2  I can focus on tests better when doing online assessments
  AF3  Doing tests on computers is better because I am used to it
  AF4  I believe that e-exams are going to replace traditional paper-based tests sooner or later
  AF5  Doing tests on computers makes me feel more stressed
Reliability
  RE1  Marking by computers is more accurate because computers do not suffer from human mistakes
  RE2  The technology that the university uses to run online exams is reliable
  RE3  Online exams do not favor students with good IT skills because they are done with basic steps
  RE4  There is no chance of getting an easier e-exam because questions are drawn from a test bank according to their difficulty
Validity
  VA1  Online assessment is appropriate for my subject area
  VA2  The question types of e-exams are of various forms (matching, gap filling, paragraph writing)
  VA3  E-assessments test not only students' subject knowledge but also computer skills
  VA4  University students must have sufficient digital abilities to deal with computer tests
Practicality
  PR1  Online assessments help save a certain amount of paper
  PR2  Technical problems can be forecasted and solved by the school
  PR3  It is an opportunity for students to interact and feel confident with technology
  PR4  E-assessment is more accessible as learners can take the test in their own comfortable places
Security
  SE1  The scores of online assessments are securely saved
  SE2  Teachers have sensible methods to control cheating (i.e., setting limited time, filming the whole process)
  SE3  The online exam system is well protected from hackers
  SE4  The information of account and password is well secured
Teaching and learning
  TL1  The potential for immediate feedback could help students learn
  TL2  The scores are displayed as soon as the tests are completed, minimizing worrisome waiting
  TL3  Online assessment goes hand-in-hand with e-learning
  TL4  Online assessments benefit students' experience with international standardized tests
Attitude
  ATT1  It is desirable for me to undertake e-assessment
  ATT2  I think it is good for me to get accustomed to e-assessment
  ATT3  Overall, my attitude towards e-assessment is favorable
3.2. Quantitative: Instrument design
After designing the hypothesized model and refining the scales to suit the Vietnamese context, all items were measured using a five-point Likert scale ranging from 1=strongly disagree to 5=strongly agree. The sample size was then calculated to prepare for the formal data collection. Based on the formulas of [19], [20], commonly interpreted as at least five observations per questionnaire item (5×28=140) and at least 50+8m observations for a regression with m=6 predictors (98), a minimum sample size of 140 students qualifies the results of this 28-item questionnaire for both EFA and regression analysis. To compare the attitudes of IT-intensive and non-IT-intensive students, 200 questionnaires were distributed to the Information Technology Faculty and another 200 to random students outside the IT Faculty. After the removal of faulty surveys due to incompletion and/or failed trap questions, data from 151 and 161 surveys were collected from the IT-majored and non-IT-majored groups respectively and kept on a strictly secured server before the data analysis process.

3.3. Measure assessment
Using SPSS Version 25, model fitness was assessed by two tests: reliability and EFA. For the former, the two requirements of Cronbach's alpha lying within the range of 0.5<α<0.95 and corrected item-total correlations being higher than 0.3 were applied [21]. The item AF5, whose item-total correlation was -0.374, was dismissed as it failed to meet the second requirement. After the removal, the test was re-performed, and all figures were within their threshold levels. The dataset then entered EFA, whose primary purpose is to examine the convergent validity of the scales.
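For readers who wish to reproduce these screening steps outside SPSS, the sketch below shows one possible Python implementation; it assumes the responses are loaded into a hypothetical pandas DataFrame "survey_df" with the item codes of Table 1 as columns, and uses the factor_analyzer package only for the KMO and Bartlett checks reported next.

import numpy as np
import pandas as pd
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

def cronbach_alpha(items: pd.DataFrame) -> float:
    # Cronbach's alpha for one construct (rows = respondents, columns = items)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

def corrected_item_total(items: pd.DataFrame) -> pd.Series:
    # Correlation of each item with the sum of the remaining items in its construct
    return pd.Series({c: items[c].corr(items.drop(columns=c).sum(axis=1))
                      for c in items.columns})

def screen_construct(items: pd.DataFrame) -> pd.DataFrame:
    # Drop items with corrected item-total correlation <= 0.3 (AF5 would fail this check)
    keep = corrected_item_total(items) > 0.3
    retained = items.loc[:, keep[keep].index]
    assert 0.5 < cronbach_alpha(retained) < 0.95
    return retained

# Factor-adequacy checks on the pooled independent items:
# chi2, p_value = calculate_bartlett_sphericity(survey_df)   # expect p < 0.05
# _, kmo_total = calculate_kmo(survey_df)                    # expect 0.5 <= KMO <= 1.0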
The KMO value for all proposed items was 0.827, within the recommended range of 0.5-1.0 [22]. Next, Bartlett's test of sphericity yielded a p-value of 0.000<0.05, suggesting a high correlation between variables [23]. Against the requirement of exceeding 50%, the total variance explained of 68.277% is acceptable [24]. In the rotated component matrix, the factor loadings of both the independent and dependent variables are higher than 0.5. Furthermore, all constructs are well grouped into five clusters, with no item leaving its initial independent group. Thus, the re-constructed factors are eligible to enter the regression analysis.

3.4. Hypothesis testing
In the next stage of data analysis, model fitness and the validity of the hypotheses were tested through multiple regression analysis, with several criteria examined before applying the standardized regression equation. For the IT-intensive group, the Durbin-Watson statistic of 2.157, which is close to 2, indicated that the regression model does not suffer from first-order autocorrelation [21]. Next, the p-value of the F-test at 0.000<0.05 indicated that the independent variables in the model have a linear correlation with the dependent variable. Lastly, the VIF values of 1.024 to 2.418 are all lower than 3, indicating that the regression model does not suffer from multicollinearity [21]. For the non-IT-intensive group, the Durbin-Watson statistic was also recorded at 2.157, the p-value of the F-test was 0.000, and the VIF values fluctuate slightly between 1.164 and 1.615; these coefficients likewise met the requirements of multivariate regression analysis. Through multiple regression analysis, the extent to which the six proposed factors impact students' attitudes in the IT-majored and non-IT-majored groups is presented in Table 2 and Table 3, respectively.

Table 2. Coefficients of IT-intensive group
Model        B      Std. Error  Beta    t       Sig.   Tolerance  VIF
(Constant)   0.324  0.446               0.725   0.469
AF           0.247  0.068       0.252   3.615   0.000  0.599      1.670
VA           0.433  0.102       0.358   4.260   0.000  0.414      2.418
PR           0.092  0.078       0.081   1.173   0.243  0.616      1.623
RE           0.010  0.063       0.010   0.154   0.878  0.633      1.579
SE           0.331  0.093       0.269   3.550   0.001  0.510      1.960
TL           0.056  0.053       0.057   -1.046  0.297  0.977      1.024

Table 3. Coefficients of non-IT-intensive group
Model        B      Std. Error  Beta    t       Sig.   Tolerance  VIF
(Constant)   1.084  0.649               1.669   0.097
AF           0.509  0.080       0.486   6.373   0.000  0.619      1.615
VA           0.139  0.096       0.109   1.452   0.149  0.640      1.564
PR           0.213  0.119       0.122   1.794   0.075  0.779      1.283
RE           0.071  0.080       0.062   0.890   0.375  0.748      1.338
SE           0.224  0.083       0.186   2.709   0.008  0.762      1.312
TL           0.018  0.082       0.014   0.221   0.825  0.859      1.164
(B and Std. Error are the unstandardized coefficients; Beta is the standardized coefficient; Tolerance and VIF are collinearity statistics.)
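The following sketch (not the authors' SPSS procedure) illustrates how the same regression and its Durbin-Watson and VIF diagnostics could be reproduced with Python's statsmodels, assuming composite factor scores are stored as columns AF, VA, PR, RE, SE, TL, and ATT of a hypothetical DataFrame:

import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor
from statsmodels.stats.stattools import durbin_watson

def attitude_regression(df: pd.DataFrame):
    # Multiple regression of attitude on the six proposed antecedents
    X = sm.add_constant(df[["AF", "VA", "PR", "RE", "SE", "TL"]])
    model = sm.OLS(df["ATT"], X).fit()
    dw = durbin_watson(model.resid)  # should be close to 2 (no first-order autocorrelation)
    vif = {name: variance_inflation_factor(X.values, i)  # should stay below 3
           for i, name in enumerate(X.columns) if name != "const"}
    return model, dw, vif

# model, dw, vif = attitude_regression(it_group_df)   # hypothetical IT-group DataFrame
# print(model.summary())                               # B, t and Sig. values comparable to Table 2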
4. RESULTS AND DISCUSSION
Utilizing a scenario-based survey with 312 student volunteers and multivariate regression analysis, all hypotheses and 28 items were found to work well except for AF5, which was intended as a research trap question. Overall, the proposed components of the SPEAQ positively affect students' attitudes, an outcome in line with the results of [17], [25], [26]. Based on the standardized coefficients (Beta, β), the effects of each identified determinant on the students of the two research groups are interpreted in the subsections below.

4.1. Affective factors
Affective factors were found to significantly affect both research groups. We thus confirm the findings of [27], [28], which elucidated that the perception of hedonic attributes is an excellent catalyst for academic effort, attention, engaged emotions, and engaged behaviors of learners. Researching the context of Vietnamese secondary and high schools, Tang, Nguyen, and Tran [29] further concluded that affective factors not only directly affected learners' attitudes but also indirectly impacted their plans, intentions, and commitments to a planned behavior in the future. Through our empirical findings, we confirm that this explanation also applies to the higher education context of Vietnam.
4.2. Validity
Validity was found to significantly affect the IT group but only moderately affect the other group. As our work is the first to follow this line of comparison, we could not find previous support and hence relied on our experience of university students' attitudes. It is self-explanatory that IT students would consider e-assessments more valid for their subject area, as all their tests throughout the course are computer-based; this is also why they have higher expectations of digital literacy in the collegiate environment. Meanwhile, non-IT students usually have fewer interactions with computers during their studies and thus may not appreciate the validity of e-assessments as highly.

4.3. Practicality
In contrast to validity, practicality is a modest determinant of attitudes in the IT group but a profound one in the non-IT group. Previous evidence is also unavailable to support this outcome, as earlier studies did not follow this comparison method. However, investigating high school students in general, the studies of [30], [31] both affirmed that the practicality of e-assessment was a significant element in students' perspective: with less confidence in their own digital skills, non-IT students tend to be more concerned about prompt support from responsible staff to fix technical problems and about greater chances to interact with computers. This might explain why practicality turned out to be more critical for the attitudes of non-IT students than for their counterparts in this study.

4.4. Reliability
Reliability was reported to play an insignificant positive role in the attitudes of both groups, an outcome opposite to those of [15], [16]. Based on our insights into Vietnamese students, we suppose there are three main reasons behind this finding. Firstly, since students took asynchronous tests online during the research period, they could actively control the quality of their own computers. Moreover, students were always given sufficient time to check the results and discuss any queries related to the tests with lecturers. Lastly, as most of the online tests contained a large number of short-answer questions, it is hard for students to either remember the test contents or compare the difficulty of their own tests with those of their peers. In the short run, more in-depth interviews at other universities are encouraged to examine these explanations.

4.5. Security
Security was determined to have a profound effect on the attitudes of students in both groups, in line with the findings of [16], [27]. In fact, previous studies have confirmed that when receiving security notifications, especially when being asked to react, most users tended to reject the online learning system. Security in online learning is closely connected to authentication, encryption, access control, and the management of users and their permissions to ensure secure use and access [32]. Through our empirical findings, we confirm that this explanation also applies to the higher education context of Vietnam.

4.6. Teaching and learning
Although the teaching-learning dimension played the least important role in the present study, its effect was rated as highly significant in [15], [16]. This contrast might be explained by the fact that teaching-learning activities in Vietnam have long been associated with paper-based forms.
Therefore, the abrupt shift to e-assessment due to COVID-19 may not have given students sufficient time to experience its benefits for learning outcomes.
Based on the independent-samples T-test results, our study concluded that students at the university have positive attitudes towards e-exams and that discipline does not cause a dramatic difference in learners' mindsets. This contrasts with earlier evidence that clusters such as education and software engineering were more in favor of e-exams while others such as chemistry, mathematics, and biology were more reluctant [25]. It might be explained by the fact that students at this private university have been well equipped with educational technology in their daily study, so most of them, regardless of major, feel comfortable and familiar with e-exams.
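As a rough illustration of this group comparison (a sketch, not the authors' SPSS procedure; the DataFrames and the ATT column are hypothetical), the independent-samples T-test could be run in Python as follows:

import pandas as pd
from scipy import stats

def compare_majors(it_att: pd.Series, non_it_att: pd.Series):
    # Levene's test decides whether equal variances can be assumed
    _, p_levene = stats.levene(it_att, non_it_att)
    t, p = stats.ttest_ind(it_att, non_it_att, equal_var=p_levene > 0.05)
    return t, p  # p > 0.05 suggests no significant attitude difference between majors

# t, p = compare_majors(it_group_df["ATT"], non_it_group_df["ATT"])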
5. CONCLUSION
With these empirical findings, this study has enriched the research literature on technology-based assessment from the view of test-takers, which is still scarce in Vietnam. Through focus group discussion, pilot testing, and formal survey distribution, this study puts forward helpful suggestions to further refine the student perception of e-assessment questionnaire, enhancing its applicability and sustainability for wide-spread implementation. Moreover, an outcome variable of attitude was added, not only to identify students' attitudes by calculating means but also to highlight the most influential factors affecting attitudes via regression analysis, providing more solid evidence for future policy construction.
As for managerial contributions, since our findings show that major does not cause great differences in students' attitudes, the same guideline, with a strong emphasis on security and affective factors, could be applied to all departments at the school. We recommend that Nguyen Tat Thanh University in particular, and higher education institutions in general, construct technology-embedded academic landscapes that maximize students' digital familiarity and proficiency. Furthermore, higher education institutions need effective maintenance policies so that students are less likely to face technical problems and discouragement. Regarding security, it is reasonable for students to strongly demand that their account information, and especially their scores, be kept safe, as schools' systems are often hacked to tamper with scores or sell personal information. Educational institutions should invest in upgrading their information security management systems so that any errors or attacks on program code can be flagged in a timely manner. Regarding cheating, besides common methods such as assigning different tests to different students or setting time limits, the use of proctored tests is highlighted: after requiring students to log in with their school-provided accounts, proctored tests confirm students' identities via webcams, monitoring and flagging any signs of suspicious activity. All of the suggestions above are thought to be significant for further enhancing the school's capability to deliver e-assessment.
While providing some valuable findings, this study encounters some limitations. Firstly, the sample size of the non-IT group remains limited compared with the 20,000 students at Nguyen Tat Thanh University. It is therefore suggested that future research replicate our model (with or without adaptation) to further investigate each department and arrive at more generalizable outcomes. Furthermore, longitudinal studies, together with intensive qualitative designs, might provide more useful insights into students' attitudes and the reasons behind their decisions.

ACKNOWLEDGEMENTS
The authors thank the Potential Academic Staff (PAS) Grant, provided by the Research Management Centre, UTM (PY/2017/00149), for supporting and funding this research.

REFERENCES
[1] T. Việt, “High quality education for all should be Vietnam’s priority,” The World Bank, Apr. 2012. [Online]. Available: https://www.worldbank.org/en/news/feature/2012/04/25/high-quality-education-for-all-should-be-vietnams-priority (accessed Jan. 21, 2022).
[2] A. Kiet, “Vietnam spends 5.8% of GDP on education,” Hanoi Times, Jan. 2019. [Online]. Available: https://hanoitimes.vn/vietnam-spends-58-of-gdp-on-education-1781.html (accessed Jan. 21, 2022).
[3] T. Huynh, “Concerns about computer-based national high school exams,” (in Vietnamese), Tuoitre News, 2019. [Online]. Available: https://tuoitre.vn/nhieu-ban-khoan-khi-thi-thpt-quoc-gia-tren-may-tinh-20190926200012856 (accessed Jan. 21, 2022).
[4] R. Fuller, V. Joynes, J. Cooper, K. Boursicot, and T. Roberts, “Could COVID-19 be our ‘There is no alternative’ (TINA) opportunity to enhance assessment?” Medical Teacher, vol. 42, no. 7, pp. 781–786, Jul. 2020, doi: 10.1080/0142159X.2020.1779206.
[5] S. I. Malik and J.
Coldwell-Neilson, “Gender differences in an introductory programming course: New teaching approach, students’ learning outcomes, and perceptions,” Education and Information Technologies, vol. 23, no. 6, pp. 2453–2475, Nov. 2018, doi: 10.1007/s10639-018-9725-3.
[6] A. Zabadi, “Investigate students attitudes toward computer based test (CBT) at chemistry course,” Archives of Business Research, vol. 4, no. 6, Dec. 2016, doi: 10.14738/abr.46.2325.
[7] B. Z. Waziri, F. A. Tahir, and M. B. Sani, “Students’ perception of computer based approach to examining undergraduate accounting courses in the University of Maiduguri, Nigeria,” European Journal of Business and Management, vol. 11, no. 8, Mar. 2019, doi: 10.7176/ejbm/11-8-03.
[8] E. Pools and C. Monseur, “Student test-taking effort in low-stakes assessments: evidence from the English version of the PISA 2015 science test,” Large-Scale Assessments in Education, vol. 9, no. 1, p. 10, Dec. 2021, doi: 10.1186/s40536-021-00104-6.
[9] M. Kehal, “Assurance of learning and accreditations in business schools: an AACSB perspective,” Journal of Economic and Administrative Sciences, vol. 36, no. 1, pp. 82–96, Oct. 2019, doi: 10.1108/jeas-06-2018-0066.
[10] N. Utaberta and B. Hassanpour, “Aligning assessment with learning outcomes,” Procedia - Social and Behavioral Sciences, vol. 60, pp. 228–235, Oct. 2012, doi: 10.1016/j.sbspro.2012.09.372.
[11] R. Bashitialshaaer, M. Alhendawi, and Z. Lassoued, “Obstacle comparisons to achieving distance learning and applying electronic exams during COVID-19 pandemic,” Symmetry, vol. 13, no. 1, pp. 1–16, Jan. 2021, doi: 10.3390/sym13010099.
[12] F. D. Davis, “Perceived usefulness, perceived ease of use, and user acceptance of information technology,” MIS Quarterly: Management Information Systems, vol. 13, no. 3, pp. 319–339, Sep. 1989, doi: 10.2307/249008.
[13] H. Li and J. Yu, “Learners’ continuance participation intention of collaborative group project in virtual learning environment: an extended TAM perspective,” Journal of Data, Information and Management, vol. 2, no. 1, pp. 39–53, Mar. 2020, doi: 10.1007/s42488-019-00017-8.
[14] M. Bahar and M. Asil, “Attitude towards e-assessment: influence of gender, computer usage and level of education,” Open Learning, vol. 33, no. 3, pp. 221–237, Sep. 2018, doi: 10.1080/02680513.2018.1503529.
[15] J. Dermo, “e-assessment and the student learning experience: A survey of student perceptions of e-assessment,” British Journal of Educational Technology, vol. 40, no. 2, pp. 203–214, Mar. 2009, doi: 10.1111/j.1467-8535.2008.00915.x.
[16] K. Rajasekharan Pillai and A. V. Prakash, “Technological leverage in higher education: An evolving pedagogy,” Journal of International Education in Business, vol. 10, no. 2, pp. 130–146, Nov. 2017, doi: 10.1108/JIEB-09-2016-0034.
[17] H. K. Mahmood, F. Hussain, M. Mahmood, R. Kumail, and J. Abbas, “Impact of e-assessment at middle school students’ learning: an empirical study at USA middle school students,” International Journal of Scientific & Engineering Research, vol. 11, no. 4, pp. 1722–1736, 2020.
[18] M. Liu and L. Wronski, “Trap questions in online surveys: Results from three web survey experiments,” International Journal of Market Research, vol. 60, no. 1, pp. 32–49, Jan. 2018, doi: 10.1177/1470785317744856.
[19] J. F. Hair, R. E. Anderson, R. L. Tatham, and W. C. Black, Multivariate Data Analysis. Upper Saddle River, N.J: Prentice Hall, 1998.
[20] S. B. Green, “How many subjects does it take to do a regression analysis?” Multivariate Behavioral Research, vol. 26, no. 3, pp. 499–510, Jul. 1991, doi: 10.1207/s15327906mbr2603_7.
[21] J. F. Hair, G. T. M. Hult, C. Ringle, and M. Sarstedt, A primer on partial least squares structural equation modeling (PLS-SEM), 2nd ed. Los Angeles: Sage, 2017.
[22] N. Ul Hadia, N. Abdullah, and I. Sentosa, “An easy approach to exploratory factor analysis: marketing perspective,” Journal of Educational and Social Research, vol. 6, no. 1, p. 215, Jan. 2016, doi: 10.5901/jesr.2016.v6n1p215.
[23] G. Rasch, Probabilistic models for some intelligence and attainment tests. Chicago: MESA Press, 1993.
[24] J. F. Hair, Jr., R. E. Anderson, R. L. Tatham, and W. C. Black, Multivariate data analysis with readings, 4th ed. Englewood Cliffs, N.J: Prentice Hall, 1995.
[25] M. Hillier, “The very idea of e-Exams: student (pre)conceptions,” in Proceedings of ASCILITE 2014 - Annual Conference of the Australian Society for Computers in Tertiary Education, 2014, pp. 77–88.
[26] K. Awad, “Attitude of Ash-Shobak University College students to e-exam for intermediate university degree in Jordan,” Journal of Education and Practice, vol. 7, no. 9, pp. 10–17, 2016.
[27] V. J. Shute, J. P. Leighton, E. E. Jang, and M. W. Chu, “Advances in the science of assessment,” Educational Assessment, vol. 21, no. 1, pp. 34–59, Jan. 2016, doi: 10.1080/10627197.2015.1127752.
[28] D. Abun, T. Magallanes, and M. J. Incarnacion, “College students’ cognitive and affective attitude toward higher education and their academic engagement,” International Journal of English Literature and Social Sciences, vol. 4, no. 5, pp. 1494–1507, 2019, doi: 10.22161/ijels.45.38.
[29] T. T. Tang, T. N. Nguyen, and H. T. T. Tran, “Vietnamese teachers’ acceptance to use e-assessment tools in teaching: An empirical study using PLS-SEM,” Contemporary Educational Technology, vol. 14, no. 3, p. ep375, May 2022, doi: 10.30935/cedtech/12106.
[30] S. S. M. Huda, M. Kabir, and T. Siddiq, “E-assessment in higher education: Students’ perspective,” International Journal of Education and Development using Information and Communication Technology, vol. 16, no. 2, pp. 250–258, 2020.
[31] M. Appiah and F. V. Tonder, “E-assessment in higher education: A review,” International Journal of Business Management & Economic Research, vol. 9, no. 6, pp. 1454–1460, 2018.
[32] K. Thamadharan and N. Maarop, “The acceptance of e-assessment considering security perspective: Work in progress,” International Journal of Social, Behavioral, Economics, Business and Industrial Engineering, vol. 9, no. 3, pp. 874–879, 2015, doi: 10.5281/zenodo.1099873.
BIOGRAPHIES OF AUTHORS
Bao Linh Tran is a researcher at the Institute of Interdisciplinary Social Sciences, Nguyen Tat Thanh University, Ho Chi Minh City, Vietnam. Her research interests lie in technology-based learning, educational assessment, innovations in education, and student satisfaction. She can be contacted at email: tranbaolinh0108@gmail.com.
Lan Phuong Nguyen holds a PhD in Education Administration and is Vice-rector of Nguyen Tat Thanh University, Ho Chi Minh City, Vietnam. She is one of the founding members of Nguyen Tat Thanh University and participates in interdisciplinary research spanning economics, management, technology, and the development of scientific research skills. She can be contacted at email: nlphuong@ntt.edu.vn.