Reference: configuring a local Ollama model in openclaw.json on Windows


The way a local openclaw instance is configured to call a local Ollama server differs from tool to tool. I found that when installing openclaw via Docker through 1panel, the built-in UI for configuring local Ollama always injects extra auto-generated content, which causes errors after the configuration is saved. Below is the API configuration that works on Windows, kept here for reference.

Ollama does not require an API key by default, so any placeholder such as any-value can be used.

{
  "agents": {
    "defaults": {
      "workspace": "C:\\Users\\Administrator\\.openclaw\\workspace",
      "model": {
        "primary": "ollama/gemma4:31b" 
      },
      "models": {
        "ollama/gemma4:31b": {}
      }
    }
  },
  "gateway": {
    "mode": "local",
    "port": 18789,
    "auth": {
      "mode": "token",
      "token": "a86744d80f61f9fsasd9fsddf6d22fsf69fsdf09ab0"
    }
  },
  "models": {
    "providers": {
      "ollama": {
        "baseUrl": "http://192.168.100.254:11434",
        "apiKey": "any-value",
        "api": "openai-responses",
        "models": []
      }
    }
  },
  "meta": {
    "lastTouchedVersion": "2026.4.26",
    "lastTouchedAt": "2026-04-29T04:15:30.719Z"
  },
  "wizard": {
    "lastRunAt": "2026-04-29T03:55:39.940Z",
    "lastRunVersion": "2026.4.26",
    "lastRunCommand": "doctor",
    "lastRunMode": "local"
  },
  "plugins": {
    "entries": {
      "ollama": {
        "enabled": true
      }
    }
  }
}
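As a quick sanity check, the relevant pieces of this config can be read programmatically before launching openclaw. Below is a minimal Python sketch; the JSON sample embedded in it is trimmed from the config above (in practice you would read the real openclaw.json from disk, e.g. from the path under C:\Users\Administrator\.openclaw, which is an assumption based on the workspace path shown):

```python
import json

# Trimmed sample of the openclaw.json structure shown above.
# In practice, load the real file instead of this string.
config_text = """
{
  "agents": {"defaults": {"model": {"primary": "ollama/gemma4:31b"}}},
  "models": {
    "providers": {
      "ollama": {
        "baseUrl": "http://192.168.100.254:11434",
        "apiKey": "any-value",
        "api": "openai-responses"
      }
    }
  }
}
"""

config = json.loads(config_text)

# The primary model id has the form "<provider>/<model>"; the provider
# part must match a key under models.providers.
primary = config["agents"]["defaults"]["model"]["primary"]
provider_name, model_name = primary.split("/", 1)

provider = config["models"]["providers"][provider_name]
print(provider_name)        # ollama
print(model_name)           # gemma4:31b
print(provider["baseUrl"])  # http://192.168.100.254:11434
```

The key point this illustrates is the pairing that trips up the 1panel UI: the provider prefix in `agents.defaults.model.primary` must exactly match an entry under `models.providers`, otherwise openclaw cannot resolve the model and the configuration fails.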
