5 CONCLUSIONS
Students often fail to solve real-life problems correctly because they do not know how to integrate concepts from multiple disciplines. STEM education can strengthen students' problem-solving skills by applying integrated, multidisciplinary concepts. These skills can be measured through an assessment guided by a rubric. Teachers therefore need a valid rubric to assess and improve students' real-life problem-solving skills in STEM education, and they can use the results to refine their teaching methods and further enhance those skills.
A rubric should contain four components: identifying the problem, planning a solution, evaluating the solution, and communicating the solution. Rubric validation consists of revision by more than two experts from both the same and different fields, followed by testing of the rubric. Once the rubric has been shown to be valid and reliable, teachers can use it to analyze students' real-world problem-solving skills in STEM education.
However, little information is available about how much using a valid rubric in STEM education actually improves students' problem-solving. Future research could therefore measure the improvement of students' real-life problem-solving skills in STEM education, for example by examining students' interest in STEM education based on assessment results obtained with a valid rubric. With a sound rubric, teachers can analyze whether students solve real-world problems step by step and then give feedback, so that students understand their current ability and can improve their problem-solving skills. Without a rubric, teachers cannot analyze students' problem-solving skills accurately because there are no indicators of those skills.