Hello community,
my overall goal is to simulate the cooling of a battery.
To illustrate my problem, I simplified the model to a two-part setup. The model shown in the picture consists of two parts: an aluminum profile, which generates heat, and a water pipe that is supposed to carry the heat away.
The values I set for the simulation are:
- Heat source: 100,000 W/m^3 (constant)
- Constant inlet velocity: 0.1 m/s in normal direction, with an inlet temperature of 298 K
- Outlet: default conditions
- Wall (see picture two): the 4 walls connecting the aluminum and the water. The thermal boundary condition is shown in picture two as well (flux, heat flux 0, convective heat transfer coefficient = 2300 W/(m^2 K), convective heat reference temperature 298 K). The value alpha = 2300 is taken from the internet for water flow inside a pipe (see the sanity-check sketch after this list).
- Default wall (the brown surfaces in picture 2): boundary conditions: flux, heat flux 0, convective heat transfer coefficient 0, convective heat reference temperature 273.16 K
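As a rough plausibility check for alpha = 2300, here is a small Python sketch based on the Dittus-Boelter correlation (Nu = 0.023 Re^0.8 Pr^0.4 for a fluid being heated). The hydraulic diameter and the water properties in it are only assumed example values, not the actual dimensions of my model:

```python
# Rough sanity check of the convective heat transfer coefficient alpha
# using the Dittus-Boelter correlation Nu = 0.023 * Re**0.8 * Pr**0.4
# (valid for turbulent pipe flow, roughly Re > 10,000).
# NOTE: D_h and the water properties are assumed example values,
# not taken from my actual geometry.

rho = 997.0      # water density [kg/m^3] at ~25 degC
mu  = 0.00089    # dynamic viscosity [Pa*s]
k   = 0.60       # thermal conductivity [W/(m*K)]
cp  = 4183.0     # specific heat capacity [J/(kg*K)]

v   = 0.1        # inlet velocity [m/s] (from my boundary condition)
D_h = 0.02       # hydraulic diameter [m] -- placeholder, adjust to the real pipe

Re = rho * v * D_h / mu          # Reynolds number
Pr = cp * mu / k                 # Prandtl number
Nu = 0.023 * Re**0.8 * Pr**0.4   # Dittus-Boelter (fluid being heated)
alpha = Nu * k / D_h             # heat transfer coefficient [W/(m^2*K)]

print(f"Re = {Re:.0f}, Pr = {Pr:.2f}, Nu = {Nu:.1f}, alpha = {alpha:.0f} W/m^2K")
if Re < 1e4:
    print("Warning: Re below ~10,000 -> Dittus-Boelter not strictly valid here.")
```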
In picutre 3 "results" you can see the temperature profile. the results kind of make sense to me, but the actual values are not matching my analytical calculation.
Using the formula Q = mass_flow * c_p_water * DeltaT, the outlet temperature should be 298.478 K, with Q = 75,000 W (heat source in W/m^3 times a volume of 0.75 m^3), c_p_water = 4183 J/kg/K and T_in = 298 K. The average outlet temperature in the simulation is 298.0477 K, so the temperature rise is only about 1/10 of the analytical result. The same thing happens in my original battery simulation: the analytically calculated outlet temperature should be 302.5 K, but the result is 300 K for an inlet temperature of 298 K. So the Delta T in my original simulation is off by a factor of roughly 2.25 (4.5 K expected vs. 2 K simulated).
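For reference, this is the simple energy balance behind my analytical estimate, written out as a small Python sketch. The mass flow is only a placeholder here, since I did not list the pipe cross-section above (in my hand calculation it follows from rho * v_inlet * A_inlet):

```python
# Energy balance used for the analytical check:
#   Q = mdot * c_p * (T_out - T_in)  ->  T_out = T_in + Q / (mdot * c_p)
# NOTE: mdot is a placeholder -- set it to rho * v * A of the real inlet.

q_vol  = 100_000.0     # volumetric heat source [W/m^3]
volume = 0.75          # heated aluminum volume [m^3]
Q = q_vol * volume     # total heat input [W] -> 75,000 W

c_p  = 4183.0          # specific heat of water [J/(kg*K)]
T_in = 298.0           # inlet temperature [K]
mdot = 1.0             # mass flow [kg/s] -- placeholder value

T_out = T_in + Q / (mdot * c_p)
print(f"Q = {Q:.0f} W, expected outlet temperature = {T_out:.3f} K")
```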
It seems to me that not all of the heat is absorbed by the water, but that some of it disappears somewhere. As far as I understand, though, I have set all heat transfer at the default wall to zero, so only the water should be able to take up the heat.
I hope my problem is understandable and that someone can help me. As a new user of HyperWorks CFD, I really have no solution.
Kind regards
Daniel