In the past few months, community users have contributed many plug-ins to Apache APISIX, enriching its ecosystem. From the users' perspective, the emergence of more diverse plug-ins is undoubtedly a good thing. While preserving Apache APISIX's high performance and low latency, they meet users' growing expectations for the gateway to be a "one-stop", multi-functional solution.
How do community contributors develop plug-ins for Apache APISIX? No article on the Apache APISIX blog seems to describe the plug-in development process in detail. So this time, let's change perspective and look at the whole process of the birth of a plug-in from the viewpoint of a plug-in developer!
This article records how a front-end engineer with no back-end experience developed the file-logger plug-in. Before describing the implementation in detail, let's briefly introduce what file-logger does.
Function introduction
file-logger supports using Apache APISIX plug-in metadata to generate custom log formats. With the file-logger plug-in, users can append request and response data in JSON format to log files, or push the log data stream to a specified location.
Imagine that when monitoring the access log of a route, we often care not only about the values of certain request and response fields, but also want to write that log data to a separate, specified file. The file-logger plug-in helps implement exactly these requirements.
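As a concrete illustration, the custom log format mentioned above is configured through the plug-in metadata Admin API. The call below is only a rough sketch: it assumes the default Admin API address and key, and the three NGINX variables are just illustrative fields; see the official plugin-metadata documentation for the exact options supported.

```shell
curl http://127.0.0.1:9080/apisix/admin/plugin_metadata/file-logger \
  -H 'X-API-KEY: edd1c9f034335f136f87ad84b625c8f1' -X PUT -d '
{
    "log_format": {
        "host": "$host",
        "@timestamp": "$time_iso8601",
        "client_ip": "$remote_addr"
    }
}'
```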
In the concrete implementation, file-logger writes the log data to a separate, specified log file, which simplifies monitoring and debugging.
Development and implementation process
Now that we have introduced what file-logger does, let me explain in detail how I, with no server-side experience, completed this plug-in from scratch and added the corresponding tests for Apache APISIX.
Determine plug-in name and priority
Open the Apache APISIX plug-in development guide; the following items need to be determined in order:
- Determine the plug-in classification.
- Determine the plug-in priority and update the conf/config-default.yaml file.
Because the file-logger developed this time is a log-type plug-in, I referred to the names and ordering of the existing log plug-ins in Apache APISIX and placed file-logger among them:
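The snippet below is only a rough sketch of where the new entry sits in the plugins list of conf/config-default.yaml; the exact neighboring plug-ins and priority comments vary between APISIX versions.

```yaml
plugins:                          # plugin list (the comments note each plug-in's priority)
  # ...
  - http-logger                   # priority: 410
  # ...
  - udp-logger                    # priority: 400
  - file-logger                   # priority: 399
  # ...
```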
After consulting the authors of other plug-ins and enthusiastic community members, I finally settled on the plug-in name file-logger and the priority 399.
Note that a plug-in's priority is related to its execution order: the higher the priority value, the earlier the plug-in is executed. The ordering of plug-in names, however, has nothing to do with execution order.
Create minimum executable plug-in file
After confirming the plug-in name and priority, we can create the plug-in code file in the apisix/plugins/ directory. There are two points to note here:

- If the plug-in code file is created directly in the apisix/plugins/ directory, there is no need to change the Makefile.
- If your plug-in has its own code directory, you need to update the Makefile; for the detailed steps, please refer to the Apache APISIX plug-in development guide.

Now let's create a file-logger.lua file in the apisix/plugins/ directory, and then, using the official example-plugin as a reference, complete an initial version.
```lua
-- Introduce the modules we need in the header
local core     = require("apisix.core")
local log_util = require("apisix.utils.log-util")
local plugin   = require("apisix.plugin")
local ngx      = ngx

-- Declare the plug-in name
local plugin_name = "file-logger"

-- Define the plug-in schema format
local schema = {
    type = "object",
    properties = {
        path = {
            type = "string"
        },
    },
    required = {"path"}
}

-- Plug-in metadata schema
local metadata_schema = {
    type = "object",
    properties = {
        log_format = log_util.metadata_schema_log_format
    }
}

local _M = {
    version = 0.1,
    priority = 399,
    name = plugin_name,
    schema = schema,
    metadata_schema = metadata_schema
}

-- Check whether the plug-in configuration is correct
function _M.check_schema(conf, schema_type)
    if schema_type == core.schema.TYPE_METADATA then
        return core.schema.check(metadata_schema, conf)
    end
    return core.schema.check(schema, conf)
end

-- Log phase
function _M.log(conf, ctx)
    core.log.warn("conf: ", core.json.encode(conf))
    core.log.warn("ctx: ", core.json.encode(ctx, true))
end

return _M
```
After completing the smallest usable plug-in file based on the example-plugin, we can use `core.log.warn("conf: ", core.json.encode(conf))` and `core.log.warn("ctx: ", core.json.encode(ctx, true))` to output the plug-in's configuration data and the request-related data to the error.log file.
Enable the plug-in and test it
Next, create a test route to check whether the plug-in can successfully print its configuration data and the request-related data to the error log file.
- Prepare a test upstream locally (the test upstream used in this article is 127.0.0.1:3030/api/hello, which I created locally).
- Create a route with the curl command and enable our new plug-in on it.
```shell
curl http://127.0.0.1:9080/apisix/admin/routes/1 \
  -H 'X-API-KEY: edd1c9f034335f136f87ad84b625c8f1' -X PUT -d '
{
    "plugins": {
        "file-logger": {
            "path": "logs/file.log"
        }
    },
    "upstream": {
        "type": "roundrobin",
        "nodes": {
            "127.0.0.1:3030": 1
        }
    },
    "uri": "/api/hello"
}'
```
Then you will see a status code of 200, indicating that the route has been successfully created.
- Run the curl command to send a request to this route, to test whether the file-logger plug-in has taken effect.
```shell
$ curl -i http://127.0.0.1:9080/api/hello
HTTP/1.1 200 OK
...
hello, world
```
- In the logs/error.log file, you will see a record containing the conf and ctx data printed by the plug-in.

In the conf parameter of that record, you can see that the path we configured for the plug-in, logs/file.log, has been saved successfully. So far, we have successfully created a minimum usable plug-in and printed the data of the conf and ctx parameters in the log phase.
After that, we can write the core functionality directly in the file-logger.lua plug-in code file. During development, we can run the apisix reload command to reload the latest plug-in code without restarting Apache APISIX.
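Assuming the apisix command-line tool is available in your environment (for example, when APISIX is installed from the official packages or from source), the reload is simply:

```shell
apisix reload
```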
Write core functions for the file-logger plug-in
The main function of the file-logger plug-in is to write log data to a file. After researching, I learned about Lua's I/O library (https://www.tutorialspoint.com/lua/lua_file_io.htm) and confirmed that the plug-in's logic is roughly as follows:
- After each request is processed, output the log data to the path configured in the plug-in.
- First, in the log phase, get the value of path from the plug-in configuration (conf).
- Then use the Lua I/O library to create, open, write to, flush, and close the file.
- Handle errors such as failing to open or create the file.

The resulting write_file_data helper looks like this:
```lua
-- io_open refers to Lua's io.open, localized at the top of the plug-in file
local io_open = io.open

local function write_file_data(conf, log_message)
    -- Serialize the log entry to JSON
    local msg, err = core.json.encode(log_message)
    if err then
        return core.log.error("message json serialization failed, error info: ", err)
    end

    -- Open the file in append mode, creating it if it does not exist
    local file, err = io_open(conf.path, 'a+')
    if not file then
        core.log.error("failed to open file: ", conf.path, ", error info: ", err)
    else
        local ok, err = file:write(msg, '\n')
        if not ok then
            core.log.error("failed to write file: ", conf.path, ", error info: ", err)
        else
            file:flush()
        end
        file:close()
    end
end
```
- Referring to the source code of the http-logger plug-in, I then completed the log-phase method that assembles the log entry and handles the plug-in metadata:
```lua
function _M.log(conf, ctx)
    local metadata = plugin.plugin_metadata(plugin_name)
    local entry

    -- Use the custom log format from the plug-in metadata if one is configured,
    -- otherwise fall back to the full default log
    if metadata and metadata.value.log_format
       and core.table.nkeys(metadata.value.log_format) > 0
    then
        entry = log_util.get_custom_format_log(ctx, metadata.value.log_format)
    else
        entry = log_util.get_full_log(ngx, conf)
    end

    write_file_data(conf, entry)
end
```
Verification and adding tests
Verify log collection
Because the file-logger plug-in was already enabled when we created the test route, with path configured as logs/file.log, we only need to send a request to the test route to verify the log collection result:
```shell
curl -i http://127.0.0.1:9080/api/hello
```
In the corresponding logs/file.log we can see that each record is saved in JSON format. One of the records, after formatting, looks like this:
{ "server":{ "hostname":"....", "version":"2.11.0" }, "client_ip":"127.0.0.1", "upstream":"127.0.0.1:3030", "route_id":"1", "start_time":1641285122961, "latency":13.999938964844, "response":{ "status":200, "size":252, "headers":{ "server":"APISIX\/2.11.0", "content-type":"application\/json; charset=utf-8", "date":"Tue, 04 Jan 2022 08:32:02 GMT", "vary":"Accept-Encoding", "content-length":"19", "connection":"close", "etag":"\"13-5j0ZZR0tI549fSRsYxl8c9vAU78\"" } }, "service_id":"", "request":{ "querystring":{ }, "size":87, "method":"GET", "headers":{ "host":"127.0.0.1:9080", "accept":"*\/*", "user-agent":"curl\/7.77.0" }, "url":"http:\/\/127.0.0.1:9080\/api\/hello", "uri":"\/api\/hello" } }
At this point, the log collection verification is complete. The result shows that the plug-in has taken effect and returns the expected data.
Add tests for the plug-in
The add_block_preprocessor code confused me at first, because I had never studied Perl. Only after asking around did I learn how it works: if a test case's data part does not write its own request assertion or no_error_log assertion, the following defaults are applied:
```
--- request
GET /t
--- no_error_log
[error]
```
After referring to several other log plug-in test files, I created the file-logger.t file in the t/plugin/ directory.
Each test file is divided by __DATA__ into a preamble part and a data part. Since the official website does not yet provide well-organized documentation about testing, please refer to the related materials at the end of the article for more details. Below is one of the test cases I completed after consulting those materials:
```
use t::APISIX 'no_plan';

no_long_string();
no_root_location();

add_block_preprocessor(sub {
    my ($block) = @_;

    if (! $block->request) {
        $block->set_value("request", "GET /t");
    }

    if (! $block->no_error_log && ! $block->error_log) {
        $block->set_value("no_error_log", "[error]");
    }
});

run_tests;

__DATA__

=== TEST 1: sanity
--- config
    location /t {
        content_by_lua_block {
            local configs = {
                -- full configuration
                {
                    path = "file.log"
                },
                -- property "path" is required
                {
                    path = nil
                }
            }

            local plugin = require("apisix.plugins.file-logger")

            for i = 1, #configs do
                ok, err = plugin.check_schema(configs[i])
                if err then
                    ngx.say(err)
                else
                    ngx.say("done")
                end
            end
        }
    }
--- response_body_like
done
property "path" is required
```
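To run this test file locally, the usual approach in the APISIX repository is the prove command from the Test::Nginx toolchain. The command below is only a sketch and assumes test-nginx has been set up under the project root as described in the APISIX testing documentation:

```shell
prove -Itest-nginx/lib -r t/plugin/file-logger.t
```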
At this point, adding tests for the plug-in is complete.
Summary
The above is the whole process of implementing an Apache APISIX plug-in from scratch as a back-end novice. I did run into quite a few pitfalls while developing the plug-in. Fortunately, many enthusiastic members of the Apache APISIX community helped resolve my doubts, which made the whole development and testing process of the file-logger plug-in relatively smooth. If you are interested in this plug-in or want to see more details about it, you can refer to the official Apache APISIX documentation.
At present, Apache APISIX is also developing other plug-ins to support integration with more services. If you are interested, feel free to start a conversation in GitHub Discussions, or communicate through the mailing list.