Shipping info library that provides API integrations with Brazilian companies.
Provides EMS shipping rate calculation for Japan.
Log2json lets you read, filter and send logs as JSON objects via Unix pipes.
It is inspired by Logstash, and is meant to be compatible with it at the JSON
event/record level so that it can easily work with Kibana.
Reading logs is done via a shell script (e.g., `tail`) running in its own process.
You then configure (see the `syslog2json` or the `nginxlog2json` scripts for
examples) and run your filters in Ruby using the `Log2Json` module and its
helper classes.
`Log2Json` reads logs from stdin (one log record per line), parses the log
lines into JSON records, and then serializes and writes the records to stdout,
which can then be piped to another process for further processing or for
shipping the records somewhere else.
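The line-in, JSON-out model described above can be sketched as follows. This is not the real `Log2Json` API; the `@timestamp` and `message` field names are assumptions based on the Logstash event convention the gem aims to be compatible with:

```ruby
require 'json'
require 'time'

# Turn one raw log line into a serialized JSON event record.
# Field names follow the Logstash convention (an assumption, not Log2Json's API).
def line_to_json_record(line)
  JSON.generate('@timestamp' => Time.now.utc.iso8601,
                'message'    => line.chomp)
end

# To use it as a filter in a pipe, read stdin line by line and emit one
# JSON record per line on stdout:
#   $stdin.each_line { |line| $stdout.puts line_to_json_record(line) }
```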
Currently, Log2json ships with a `tail-log` script that can be run as the input
process. It's the same as using the Linux `tail` utility with the `-v -F`
options, except that it also tracks the positions (as the number of lines read
from the beginning of each file) in a few files on the file system, so that if
the input process is interrupted, it can continue reading from where it left
off the next time the same files are followed. This feature is similar to the
sincedb feature of Logstash's file input.
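The position tracking described above can be illustrated with a minimal sketch. The state-file format (one integer line count per tracked file) is hypothetical and only demonstrates the idea, not `tail-log`'s actual on-disk format:

```ruby
require 'tmpdir'

# Persist how many lines of a log file have already been read, so a
# restarted reader can skip straight past them (sincedb-style tracking).
# The one-integer-per-file format here is illustrative, not Log2json's.
def save_position(state_file, lines_read)
  File.write(state_file, lines_read.to_s)
end

# Returns the saved line count, or 0 if no state has been recorded yet.
def resume_position(state_file)
  File.exist?(state_file) ? File.read(state_file).to_i : 0
end
```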
Note: If you don't need the tracking feature (i.e., you are fine with always
tailing from the end of the file with `-v -F -n0`), then you can just use the
`tail` utility that comes with your Linux distribution (or, more specifically,
the `tail` from GNU coreutils). Other versions of the `tail` utility may also
work, but are not tested. The input protocol expected by Log2json is very
simple and is documented in the source code.
** The `tail-log` script uses a patched version of `tail` from the GNU coreutils
package. A binary of the `tail` utility compiled for Ubuntu 12.04 LTS is
included with the Log2json gem. If the binary doesn't work for your
distribution, you'll need to get GNU coreutils 8.13, apply the patch (it
can be found in the src/ directory of the installed gem), and then replace
the bin/tail binary in the directory of the installed gem with your version
of the binary. **
P.S. If you know of a way to configure and compile ONLY the tail program in
coreutils, please let me know! The reason I'm not building tail after gem
installation is that it takes too long to configure and make, because that
actually builds every utility in coreutils.
For shipping logs to Redis, there's the `lines2redis` script that can be used
as the output process in the pipe. For shipping logs from Redis to
Elasticsearch, Log2json provides a `redis2es` script.
Finally, here's an example of Log2json in action.

From a client machine:

    tail-log /var/log/{sys,mail}log /var/log/{kern,auth}.log | syslog2json |
        queue=jsonlogs \
        flush_size=20 \
        flush_interval=30 \
        lines2redis host.to.redis.server 6379 0  # use Redis DB 0

On the Redis server:

    redis_queue=jsonlogs redis2es host.to.es.server
Resources that help with writing Log2json filters:
- look at the log2json.rb source and the example filters
- http://grokdebug.herokuapp.com/
- http://www.ruby-doc.org/stdlib-1.9.3/libdoc/date/rdoc/DateTime.html#method-i-strftime
Provides shipping rates and tracking for Active Merchant carriers.
Free shipping in Piggybak, applied depending on the product / sellable method defined in the configuration.
Choosing multiple carrier shipping options
Removes shipping options from the product and admin menus, and the delivery step from checkout.
This gem provides a basic shipping calculator for PostNL packages and letters to be sent within the Netherlands.
Provides flexible shipping rate configuration.
Shipping estimation plugin for the Workarea Commerce platform.
Easily use FedEx, UPS, USPS web services with an elegant and simple syntax.
Easily use FedEx, UPS, USPS web services with an elegant and simple syntax.
A generic background-thread worker for shipping events via HTTPS to an API backend.
Calculates shipping costs based on total item weight and quantity in the order.
Create shipments and get rates and tracking info from various shipping carriers.
2005
2006
2007
2008
2009
2010
2011
2012
2013
2014
2015
2016
2017
2018
2019
2020
2021
2022
2023
2024
Activity
0.01
Client to access the packlink.com shipping API.
Generates shipping labels and performs some of Endicia's basic operations.
Allows you to quote or ship a Physical::Shipment object
Get shipping quotes from DHL