# les-ras
  • u

    ⵣAryazⵣ

    02/26/2022, 3:15 AM
    But I was asking just to make sure I am not missing something obvious (or making some fatal error in my assumptions)
  • u

    ⵣAryazⵣ

    02/26/2022, 3:16 AM
    Anyway
  • u

    ⵣAryazⵣ

    02/26/2022, 3:16 AM
    I have another question about low-Re/High-Re flows
  • u

    ⵣAryazⵣ

    02/26/2022, 3:17 AM
    In turbulence modeling it doesn't refer to the flow Reynolds number
  • u

    ⵣAryazⵣ

    02/26/2022, 3:17 AM
    but to the near-wall treatment
  • u

    ⵣAryazⵣ

    02/26/2022, 3:18 AM
    I remember reading about that somewhere where the low-Re/High-Re refer to the local Reynolds number (something to do with the eddy viscosity to molecular viscosity ratio)
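For context, the local (turbulent) Reynolds number usually meant in this setting is indeed tied to the eddy-viscosity-to-molecular-viscosity ratio; with the standard relation for the eddy viscosity this gives (a textbook definition, not something stated in the thread):

```latex
Re_t = \frac{k^2}{\nu\,\varepsilon}, \qquad
\nu_t = C_\mu \frac{k^2}{\varepsilon}
\;\;\Rightarrow\;\;
\frac{\nu_t}{\nu} = C_\mu\, Re_t
```

Low-Re models apply damping functions where this local Re_t is small (e.g. in the viscous sublayer), while high-Re models bridge that region with wall functions.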
  • u

    ⵣAryazⵣ

    02/26/2022, 3:18 AM
    but I can't remember exactly where I read that
  • u

    ⵣAryazⵣ

    02/26/2022, 3:24 AM
    Never mind,
  • u

    ⵣAryazⵣ

    02/26/2022, 3:24 AM
    I have found it in my notes:
  • u

    ⵣAryazⵣ

    02/26/2022, 3:26 AM
  • s

    slopezcastano

    02/28/2022, 8:12 AM
    Yes, but please don't use foam's guide for studying things: that's just a rather "coarse" reference.
  • u

    ⵣAryazⵣ

    03/05/2022, 4:47 AM
    Hello
  • u

    ⵣAryazⵣ

    03/05/2022, 4:48 AM
    I have run several simulations (pimpleFoam) with k-omega SST and Spalart-Allmaras; everything worked very well with 2nd-order schemes
  • u

    ⵣAryazⵣ

    03/05/2022, 4:49 AM
    But with the k-epsilon model, the simulation crashes after a while
  • u

    ⵣAryazⵣ

    03/05/2022, 4:49 AM
    I found that it's because of 2nd order schemes for k and epsilon equations
  • u

    ⵣAryazⵣ

    03/05/2022, 4:51 AM
    //div(phi,k)        Gauss linearUpwind grad;    // <<< CRASH
    //div(phi,epsilon)  Gauss linearUpwind grad;    // <<< CRASH
    div(phi,k)          bounded Gauss upwind;       // <<< this works
    div(phi,epsilon)    bounded Gauss upwind;       // <<< this works
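A common middle ground between pure upwind and unlimited linearUpwind for the turbulence equations is a limited 2nd-order scheme. A sketch of the corresponding `system/fvSchemes` entries (standard OpenFOAM scheme names; the coefficient may need tuning per case):

```
divSchemes
{
    div(phi,k)          bounded Gauss limitedLinear 1;
    div(phi,epsilon)    bounded Gauss limitedLinear 1;
}
```

The coefficient `1` gives the strongest limiting (most stable); values closer to 0 reduce the limiting toward pure linear interpolation.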
  • u

    ⵣAryazⵣ

    03/05/2022, 4:53 AM
    I have tried running the simulation with 1st-order schemes (Gauss upwind) for a very long time (15 turbine revolutions), then switched back to the 2nd-order scheme, but I am still experiencing the same crashes
  • u

    ⵣAryazⵣ

    03/05/2022, 4:54 AM
    Result of checkMesh:
  • u

    ⵣAryazⵣ

    03/05/2022, 4:55 AM
    checkMesh
    
    Checking geometry...
        Overall domain bounding box (-2.5 -1 0) (7.5 1 0.01)
        Mesh has 2 geometric (non-empty/wedge) directions (1 1 0)
        Mesh has 2 solution (non-empty) directions (1 1 0)
        All edges aligned with or perpendicular to non-empty directions.
        Boundary openness (2.426147048e-18 -1.308494034e-17 6.596404688e-15) OK.
        Max cell openness = 7.236186345e-16 OK.
        Max aspect ratio = 32.88697808 OK.
        Minimum face area = 1.817370037e-10. Maximum face area = 0.0009034013646.  Face area magnitudes OK.
        Min volume = 1.817370037e-12. Max volume = 9.034013646e-06.  Total volume = 0.1999942193.  Cell volumes OK.
        Mesh non-orthogonality Max: 35.63294714 average: 4.582606041
        Non-orthogonality check OK.
        Face pyramids OK.
        Max skewness = 0.38108429 OK.
        Coupled point location match (average 0) OK.
    
    Mesh OK.
  • u

    ⵣAryazⵣ

    03/05/2022, 4:55 AM
    And this is the last step of pimple iteration when it crashes:
  • u

    ⵣAryazⵣ

    03/05/2022, 4:55 AM
    PIMPLE: iteration 8
    DILUPBiCGStab:  Solving for Ux, Initial residual = 0.7377962214, Final residual = 0.009557805067, No Iterations 1
    DILUPBiCGStab:  Solving for Uy, Initial residual = 0.8103846109, Final residual = 0.007150709936, No Iterations 1
    FDICPCG:  Solving for p, Initial residual = 0.1153543997, Final residual = 0.0009478532323, No Iterations 7
    time step continuity errors : sum local = 4.155521748e+22, global = -2.256536206e+11, cumulative = -4.345601279e+12
    FDICPCG:  Solving for p, Initial residual = 1.044166192e-15, Final residual = 1.627476806e-17, No Iterations 2
    time step continuity errors : sum local = 4.374003442e+30, global = -3.385330183e+14, cumulative = -3.428786196e+14
    FDICPCG:  Solving for p, Initial residual = 9.301201055e-16, Final residual = 1.944253774e-17, No Iterations 2
    time step continuity errors : sum local = 7.853619232e+30, global = -4.643251948e+14, cumulative = -8.072038144e+14
    FDICPCG:  Solving for p, Initial residual = 8.002059964e-16, Final residual = 2.0733594e-17, No Iterations 2
    time step continuity errors : sum local = 9.812030158e+30, global = 6.001452567e+14, cumulative = -2.070585577e+14
    DILUPBiCGStab:  Solving for epsilon, Initial residual = 0.2690014266, Final residual = 6.604571382e-07, No Iterations 1
    DILUPBiCGStab:  Solving for k, Initial residual = 0.9999999999, Final residual = 0.03143761467, No Iterations 1
    bounding k, min: -1.118979582e+21 max: 3.627062884e+77 average: 3.744372658e+72
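The `bounding k` line with an enormous maximum, together with the exploding continuity errors, indicates the turbulence fields have already diverged by this point. One common stabilisation (a suggestion, not something proposed in the thread) is to mildly under-relax the turbulence equations in `system/fvSolution`:

```
relaxationFactors
{
    equations
    {
        k           0.7;    // under-relax the turbulence equations
        epsilon     0.7;
        ".*"        1;      // leave the remaining equations unrelaxed
    }
}
```

With PIMPLE, the final outer iteration typically uses separate `...Final` relaxation entries, usually left at 1 for time accuracy.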
  • q

    qr

    03/05/2022, 6:47 AM
    Is this a parallel run? I remember reading a long time ago that FDIC doesn't work well in parallel.
  • q

    qr

    03/05/2022, 7:09 AM
    What about solving with a different variant of the standard kEpsilon, e.g. realizable kEpsilon? It takes additional information from the local flow, in case that stabilizes the solution process. I can only speak from my own experience here: when I monitored k and epsilon with 1st- and 2nd-order accuracy, upwind showed a stable, mean-flow solution, while the linearUpwind solution showed a wavering field around this mean. That was obviously more realistic, but an important takeaway was that 2nd order introduced oscillations. It could be that the standard kEps model is missing some key ingredient needed to prevent these from blowing up.
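For anyone wanting to try this, the RAS model is selected in `constant/turbulenceProperties`; a minimal sketch (entry names assume a recent OpenFOAM release):

```
simulationType  RAS;

RAS
{
    RASModel        realizableKE;   // alternatives: kEpsilon, RNGkEpsilon
    turbulence      on;
    printCoeffs     on;
}
```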
  • u

    ⵣAryazⵣ

    03/05/2022, 12:56 PM
    Hi, thank you for your reply
  • u

    ⵣAryazⵣ

    03/05/2022, 12:57 PM
    Yes, it's a decomposed case with 64 processors on a single node
  • u

    ⵣAryazⵣ

    03/05/2022, 12:57 PM
    I am using RNG k-epsilon model
  • u

    ⵣAryazⵣ

    03/05/2022, 12:58 PM
    I have tried both GAMG and PCG solvers for pressure equation
  • u

    ⵣAryazⵣ

    03/05/2022, 12:58 PM
    but GAMG was very slow compared with FDIC
  • u

    ⵣAryazⵣ

    03/05/2022, 12:58 PM
    In fact, with all the other simulations (kOmegaSST, Spalart-Allmaras, kOmegaSSTLM) the results were great, and I used the PCG solver for pressure and PBiCGStab for the rest of the equations
  • u

    ⵣAryazⵣ

    03/05/2022, 1:02 PM
    @User: Do you have any preconditioner recommendation for the p equation instead of FDIC?
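For reference, a DIC-preconditioned PCG setup for `p` in `system/fvSolution` looks like this (a sketch; the tolerance values are placeholders, not taken from the thread):

```
p
{
    solver          PCG;
    preconditioner  DIC;    // simpler symmetric preconditioner, alternative to FDIC
    tolerance       1e-06;
    relTol          0.01;
}
```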