Paper Title

Shared Experience Actor-Critic for Multi-Agent Reinforcement Learning

Paper Authors

Filippos Christianos, Lukas Schäfer, Stefano V. Albrecht

Paper Abstract

Exploration in multi-agent reinforcement learning is a challenging problem, especially in environments with sparse rewards. We propose a general method for efficient exploration by sharing experience amongst agents. Our proposed algorithm, called Shared Experience Actor-Critic (SEAC), applies experience sharing in an actor-critic framework. We evaluate SEAC in a collection of sparse-reward multi-agent environments and find that it consistently outperforms two baselines and two state-of-the-art algorithms by learning in fewer steps and converging to higher returns. In some harder environments, experience sharing makes the difference between learning to solve the task and not learning at all.
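The abstract's core idea — each agent learning not only from its own on-policy experience but also from transitions collected by the other agents, corrected with an importance weight — can be sketched as follows. This is a minimal illustration under our own simplified assumptions (the function name, the single borrowed-batch setup, and the array-based interface are ours, not the paper's); the authors' actual implementation operates on neural-network policies.

```python
import numpy as np

def seac_policy_loss(own_logp, own_adv,
                     shared_logp_i, shared_logp_k, shared_adv,
                     lam=1.0):
    """Sketch of an SEAC-style policy loss for agent i.

    own_logp / own_adv:  log pi_i(a|o) and advantage estimates on
                         agent i's own on-policy transitions.
    shared_logp_i:       log pi_i(a|o) evaluated on transitions
                         collected by another agent k.
    shared_logp_k:       log pi_k(a|o) of the behaviour policy that
                         actually generated those transitions.
    lam:                 weight on the borrowed (off-policy) term.
    """
    # Standard on-policy actor-critic policy-gradient term.
    own_term = -np.mean(own_logp * own_adv)
    # Importance weight pi_i / pi_k corrects the other agent's
    # experience for the mismatch with agent i's own policy.
    ratio = np.exp(shared_logp_i - shared_logp_k)
    shared_term = -lam * np.mean(ratio * shared_logp_i * shared_adv)
    return own_term + shared_term
```

With `lam = 0` this collapses to each agent learning only from its own data (the independent-learners baseline the paper compares against); a positive `lam` adds the shared-experience term that drives the reported gains.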
