<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/"><channel><title>Model Architecture on 杨の草原</title><link>https://thinkless-github-io.pages.dev/tags/%E6%A8%A1%E5%9E%8B%E6%9E%B6%E6%9E%84/</link><description>Recent content in Model Architecture on 杨の草原</description><generator>Hugo</generator><language>zh-CN</language><lastBuildDate>Tue, 06 May 2025 21:25:21 +0800</lastBuildDate><atom:link href="https://thinkless-github-io.pages.dev/tags/%E6%A8%A1%E5%9E%8B%E6%9E%B6%E6%9E%84/index.xml" rel="self" type="application/rss+xml"/><item><title>Classic Models and Architectures</title><link>https://thinkless-github-io.pages.dev/posts/%E7%BB%8F%E5%85%B8%E6%A8%A1%E5%9E%8B%E4%B8%8E%E6%9E%B6%E6%9E%84/</link><pubDate>Tue, 06 May 2025 21:25:21 +0800</pubDate><guid>https://thinkless-github-io.pages.dev/posts/%E7%BB%8F%E5%85%B8%E6%A8%A1%E5%9E%8B%E4%B8%8E%E6%9E%B6%E6%9E%84/</guid><description>Notes on classic large-model architectures, covering multimodal model connection methods, vision encoder selection, text decoder design, and DeepSeek technical details.</description></item><item><title>Transformer Modules</title><link>https://thinkless-github-io.pages.dev/posts/transformer%E6%A8%A1%E5%9D%97/</link><pubDate>Tue, 29 Apr 2025 09:32:21 +0800</pubDate><guid>https://thinkless-github-io.pages.dev/posts/transformer%E6%A8%A1%E5%9D%97/</guid><description>Notes on the Transformer architecture, covering the encoder-decoder structure, multi-head self-attention, feed-forward networks, residual connections, layer normalization, and other foundational modules.</description></item></channel></rss>