Paper Title
Analysing AIA Flare Observations using Convolutional Neural Networks
Paper Authors
Abstract
In order to efficiently analyse the vast amount of data generated by solar space missions and ground-based instruments, modern machine learning techniques such as decision trees, support vector machines (SVMs) and neural networks can be very useful. In this paper we present initial results from using a convolutional neural network (CNN) to analyse observations from the Atmospheric Imaging Assembly (AIA) in the 1600 Å wavelength. The data are pre-processed to locate flaring regions where flare ribbons are visible in the observations. The CNN is created and trained to automatically analyse the shape and position of the flare ribbons, by identifying whether each image belongs to one of four classes: two-ribbon flare, compact/circular ribbon flare, limb flare or quiet Sun, with the final class acting as a control for any data included in the training or test sets where flaring regions are not present. The resulting network can classify flare ribbon observations into any of the four classes with a final accuracy of 94%. Initial results show that most of the images are correctly classified, with the compact flare class being the only class where accuracy drops below 90%, as some observations are wrongly classified as belonging to the limb class.
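The four-class setup described in the abstract can be illustrated with a minimal sketch of such a classifier. This is an assumption-laden illustration, not the authors' architecture: the paper does not specify layer sizes, input resolution, or framework, so the PyTorch model below, the 128×128 single-channel input, and the class names are all illustrative only.

```python
import torch
import torch.nn as nn

# Hypothetical class labels matching the four categories in the abstract.
CLASSES = ["two-ribbon", "compact/circular", "limb", "quiet Sun"]

class RibbonCNN(nn.Module):
    """Minimal sketch of a four-class CNN; layer sizes are illustrative."""

    def __init__(self, n_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # single-channel AIA 1600 Å cutout
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 128 -> 64
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 64 -> 32
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 32 * 32, n_classes),          # logits for the four classes
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Forward pass on a dummy batch of two pre-processed 128x128 cutouts.
model = RibbonCNN()
logits = model(torch.zeros(2, 1, 128, 128))
predicted = [CLASSES[i] for i in logits.argmax(dim=1)]
```

Training such a model would use a standard cross-entropy loss over the four classes; the quiet-Sun class serves as the control category, as described above.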