Nuclear fusion aims to reproduce on Earth the energy source of the stars by confining a hot ionised fuel, the plasma. A fusion plasma is, however, a complex system, characterised by instabilities that develop on disparate spatio-temporal scales and, in nonlinear regimes, can lead to turbulent transport. Turbulence is well known to limit the performance of fusion devices, and numerical simulations are essential to support future experiments on ITER. One of the most efficient codes for core plasma turbulence simulations is GYSELA, developed over the past 15 years at IRFM/CEA and now evolving towards edge-core turbulence coupling in the presence of kinetic electrons, which requires exascale computational power.

To achieve this class of simulations, the computing resources and the generated data must be used efficiently, which motivates the proposed PhD thesis. Its first goal is to develop Artificial Intelligence (AI) techniques based on neural networks to detect regimes in which the numerical solution loses accuracy, a behaviour that has been identified as problematic, and thereby anticipate numerical crashes or unreliable simulations. Its second goal is to rationalise data saving, using pattern recognition to detect relevant events and neural networks to infer missing data.

Of particular interest for these AI techniques is the identification of numerical sequences that strongly involve the boundary conditions, such as those driving particle losses, namely the transfer of charge and energy into the immersed boundaries. It is then crucial to distinguish spurious numerical losses from those sustained by physically appropriate solutions, and to optimise the resolution properties and numerical penalization schemes favouring the latter.
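As a minimal sketch of the first goal, anomaly detection on per-step simulation diagnostics can be illustrated with a linear autoencoder (here, truncated PCA) fitted on healthy runs only: a new diagnostic vector whose reconstruction error exceeds what healthy runs produce is flagged as a candidate loss-of-accuracy regime. All names, shapes, features, and thresholds below are illustrative assumptions, not part of GYSELA or of the thesis workplan.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for scalar diagnostics of a simulation: each row is
# a vector of per-step observables (field energy, particle number, max
# gradient, ...). These features are assumptions for illustration only.
n_train, n_feat = 500, 8
healthy = rng.normal(0.0, 1.0, (n_train, n_feat))

# Fit a linear "autoencoder" (truncated PCA) on healthy runs only.
mean = healthy.mean(axis=0)
_, _, vt = np.linalg.svd(healthy - mean, full_matrices=False)
components = vt[:3]  # keep 3 principal directions

def reconstruction_error(samples):
    """Distance between each sample and its low-rank reconstruction."""
    centred = samples - mean
    recon = centred @ components.T @ components
    return np.linalg.norm(centred - recon, axis=1)

# Threshold: 99th percentile of the healthy reconstruction error.
threshold = np.quantile(reconstruction_error(healthy), 0.99)

def flag_loss_of_accuracy(samples):
    """True where a diagnostic vector looks unlike any healthy state."""
    return reconstruction_error(samples) > threshold

# A drifting run (e.g. a growing conservation-law violation) moves away
# from the healthy manifold and is eventually flagged.
drifting = rng.normal(0.0, 1.0, (10, n_feat)) + np.linspace(0, 5, 10)[:, None]
print(flag_loss_of_accuracy(drifting))
```

In practice the linear encoder would be replaced by a neural network trained on GYSELA diagnostics, but the principle is the same: the detector learns only healthy behaviour, so it can raise an alarm well before a crash, without needing labelled examples of every failure mode.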