Recharged catalyst with memristive nitrogen-reduction activity through learning networks of spiking neurons
Author: Xiaoke Robot | Published: 2021/4/4 23:26:02

Lizhe Liu's team at Nanjing University has developed a recharged catalyst with memristive nitrogen-reduction activity through learning networks of spiking neurons. The work was published in the Journal of the American Chemical Society on March 31, 2021.

Electrocatalytic conversion of N2 to NH3 has attracted growing attention because it offers an environmentally friendly alternative to the current Haber–Bosch process. Unfortunately, the N2-to-NH3 conversion rate remains far below the level required for large-scale implementation.

Inspired by signal memory in spiking neural networks, the researchers developed a rechargeable catalyst technology that activates and memorizes the optimal catalytic activity through controllable electrical stimulation. In this work, they designed double-faced FeReS3 Janus layers that mimic a multi-neuron network composed of resistive-switching synapses, enabling a series of intriguing multiphase transitions to activate previously undiscovered catalytic activity; the activation energy barrier is markedly lowered via active-site conversion between two nonequivalent surfaces. Electric-field-stimulated FeReS3 exhibits a Faradaic efficiency of 43% and a top NH3 synthesis rate of 203 μg h⁻¹ mg⁻¹.

Moreover, this rechargeable catalyst shows unprecedented catalytic performance that persists for up to 216 hours and can be repeatedly reactivated through a simple charging operation.
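As context for the two headline metrics, here is a minimal sketch of how Faradaic efficiency and NH3 yield rate are conventionally defined in nitrogen-reduction electrocatalysis. It assumes the standard 3-electron-per-NH3 convention (N2 + 6H+ + 6e⁻ → 2NH3); the charge and catalyst-mass values are illustrative placeholders, not data from the paper.

```python
# Conventional N2-reduction metrics (sketch; illustrative numbers only).
F = 96485.0          # Faraday constant, C/mol
M_NH3 = 17.031       # molar mass of NH3, g/mol

def faradaic_efficiency(m_nh3_ug, charge_c):
    """Fraction of passed charge converted to NH3, assuming 3 e- per NH3."""
    n_nh3 = m_nh3_ug * 1e-6 / M_NH3      # mol of NH3 produced
    return 3 * n_nh3 * F / charge_c      # dimensionless, 0..1

def yield_rate(m_nh3_ug, hours, m_cat_mg):
    """NH3 yield rate normalized to time and catalyst mass (ug h^-1 mg^-1)."""
    return m_nh3_ug / (hours * m_cat_mg)

# Illustrative inputs: 203 ug NH3 over 1 h on 1 mg catalyst,
# with 8 C of total charge passed (hypothetical value).
fe = faradaic_efficiency(m_nh3_ug=203.0, charge_c=8.0)
rate = yield_rate(m_nh3_ug=203.0, hours=1.0, m_cat_mg=1.0)
```

With these placeholder inputs the rate works out to 203 μg h⁻¹ mg⁻¹ and the Faradaic efficiency to roughly 0.43, matching the scale of the reported figures; the actual paper values come from measured NH3 quantification and charge integration.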

Appendix: original English abstract

Title: Recharged Catalyst with Memristive Nitrogen Reduction Activity through Learning Networks of Spiking Neurons

Author: Gang Zhou, Tinghui Li, Rong Huang, Peifang Wang, Bin Hu, Hao Li, Lizhe Liu, Yan Sun

Issue&Volume: March 31, 2021

Abstract: Electrocatalysis from N2 to NH3 has been increasingly studied because it provides an environmentally friendly avenue to take the place of the current Haber–Bosch method. Unfortunately, the conversion of N2 to NH3 is far below the necessary level for implementation at a large scale. Inspired by signal memory in a spiking neural network, we developed rechargeable catalyst technology to activate and remember the optimal catalytic activity using manageable electrical stimulation. Herein, we designed double-faced FeReS3 Janus layers that mimic a multiple-neuron network consisting of resistive switching synapses, enabling a series of intriguing multiphase transitions to activate undiscovered catalytic activity; the activation energy barrier is clearly reduced via an active site conversion between two nonequivalent surfaces. Electrical field-stimulated FeReS3 demonstrates a Faradaic efficiency of 43% and the highest rate of 203 μg h⁻¹ mg⁻¹ toward NH3 synthesis. Moreover, this rechargeable catalyst displays unprecedented catalytic performance that persists for up to 216 h and can be repeatedly activated through a simple charging operation.

DOI: 10.1021/jacs.0c12458

Source: https://pubs.acs.org/doi/10.1021/jacs.0c12458


Journal information

JACS: Journal of the American Chemical Society, founded in 1879 and published by the American Chemical Society. Latest IF: 14.612
Official website: https://pubs.acs.org/journal/jacsat
Submission link: https://acsparagonplus.acs.org/psweb/loginForm?code=1000