Commit

Merge branch 'master' into v-rusraut/CustomSolnOMSMigration
v-rusraut committed Aug 14, 2024
2 parents bc18d0b + 61fb8f3 commit bb1722d
Showing 184 changed files with 4,257 additions and 3,820 deletions.
Original file line number Diff line number Diff line change
@@ -246,5 +246,6 @@
"CefAma",
"WindowsFirewallAma",
"1Password",
"RadiflowIsid"
"RadiflowIsid",
"CustomLogsAma"
]
2 changes: 1 addition & 1 deletion DataConnectors/O365 Data/readme.md
@@ -10,7 +10,7 @@ The Office 365 data connector in Azure Sentinel supports ongoing user and admin

| Content Type | Description | Azure Sentinel Mapping |
| ------------ | ----------- | ---------------------- |
| Audit.AzureActiveDirectory | Azure Active Directory logs that relate to Office 365 only | Supported with the default connector for [Office 365](https://docs.microsoft.com/azure/sentinel/connect-office-365) in Azure Sentinel |
| Audit.AzureActiveDirectory | Microsoft Entra ID logs that relate to Office 365 only | Supported with the default connector for [Office 365](https://docs.microsoft.com/azure/sentinel/connect-office-365) in Azure Sentinel |
| Audit.Exchange | User and Admin Activities in Exchange Online | Supported with the default connector for [Office 365](https://docs.microsoft.com/azure/sentinel/connect-office-365) in Azure Sentinel |
| Audit.SharePoint | User and Admin Activities in SharePoint Online | Supported with the default connector for [Office 365](https://docs.microsoft.com/azure/sentinel/connect-office-365) in Azure Sentinel |
| Audit.General | Includes all other workloads not included in the previous content types | Not supported with the default connector for Office 365 in Azure Sentinel |
@@ -1,5 +1,8 @@
## 1.0.0
* Initial release for output plugin for logstash to Microsoft Sentinel. This is done with the Log Analytics DCR based API.
## 1.1.3
* Replace the rest-client library used for connecting to Azure with excon.

## 1.1.1
* Support China and US Government Azure sovereign clouds.

## 1.1.0
* Increase timeout for read/open connections to 120 seconds.
@@ -9,6 +12,5 @@
* Upgrade version for ingestion api to 2023-01-01.
* Rename the plugin to microsoft-sentinel-log-analytics-logstash-output-plugin.

## 1.1.1
* Support China and US Government Azure sovereign clouds.
* Increase timeout for read/open connections to 240 seconds.
## 1.0.0
* Initial release for output plugin for logstash to Microsoft Sentinel. This is done with the Log Analytics DCR based API.
@@ -27,7 +27,7 @@ If you do not have a direct internet connection, you can install the plugin to a
Microsoft Sentinel's Logstash output plugin supports the following versions
- 7.0 - 7.17.13
- 8.0 - 8.9
- 8.11 - 8.13
- 8.11 - 8.14

Please note that when using Logstash 8, it is recommended to disable ECS in the pipeline. For more information, refer to the [Logstash documentation](https://www.elastic.co/guide/en/logstash/8.4/ecs-ls.html).

@@ -1,10 +1,10 @@
# encoding: utf-8
require "logstash/sentinel_la/logstashLoganalyticsConfiguration"
require 'rest-client'
require 'json'
require 'openssl'
require 'base64'
require 'time'
require 'excon'

module LogStash; module Outputs; class MicrosoftSentinelOutputInternal
class LogAnalyticsAadTokenProvider
@@ -64,14 +64,13 @@ def post_token_request()
while true
begin
# Post REST request
response = RestClient::Request.execute(method: :post, url: @token_request_uri, payload: @token_request_body, headers: headers,
proxy: @logstashLoganalyticsConfiguration.proxy_aad)

if (response.code == 200 || response.code == 201)
response = Excon.post(@token_request_uri, :body => @token_request_body, :headers => headers, :proxy => @logstashLoganalyticsConfiguration.proxy_aad, expects: [200, 201])

if (response.status == 200 || response.status == 201)
return JSON.parse(response.body)
end
rescue RestClient::ExceptionWithResponse => ewr
@logger.error("Exception while authenticating with AAD API ['#{ewr.response}']")
rescue Excon::Error::HTTPStatus => ex
@logger.error("Error while authenticating with AAD [#{ex.class}: '#{ex.response.status}', Response: '#{ex.response.body}']")
rescue Exception => ex
@logger.trace("Exception while authenticating with AAD API ['#{ex}']")
end
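The migrated token provider succeeds only on HTTP 200/201 and parses the JSON body, retrying otherwise. A minimal stdlib sketch of that success check (`parse_token_response` is a hypothetical helper name, not the plugin's API):

```ruby
require 'json'

# Mirror of the new Excon-based flow: only HTTP 200/201 count as a
# successful token response; the JSON body is parsed and returned,
# and any other status yields nil so the caller's retry loop continues.
def parse_token_response(status, body)
  return nil unless [200, 201].include?(status)
  JSON.parse(body)
end
```

With `expects: [200, 201]` set on the Excon request, non-matching statuses instead raise an `Excon::Error::HTTPStatus` subclass, which is what the new `rescue` clause above catches.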
@@ -1,11 +1,11 @@
# encoding: utf-8
require "logstash/sentinel_la/version"
require 'rest-client'
require 'json'
require 'openssl'
require 'base64'
require 'time'
require 'rbconfig'
require 'excon'

module LogStash; module Outputs; class MicrosoftSentinelOutputInternal
class LogAnalyticsClient
@@ -22,28 +22,78 @@ def initialize(logstashLoganalyticsConfiguration)
@uri = sprintf("%s/dataCollectionRules/%s/streams/%s?api-version=%s",@logstashLoganalyticsConfiguration.data_collection_endpoint, @logstashLoganalyticsConfiguration.dcr_immutable_id, logstashLoganalyticsConfiguration.dcr_stream_name, la_api_version)
@aadTokenProvider=LogAnalyticsAadTokenProvider::new(logstashLoganalyticsConfiguration)
@userAgent = getUserAgent()

# Auto close connection after 60 seconds of inactivity
@connectionAutoClose = {
:last_use => Time.now,
:lock => Mutex.new,
:max_idel_time => 60,
:is_closed => true
}

@timer = Thread.new do
loop do
sleep @connectionAutoClose[:max_idel_time] / 2
if is_connection_stale?
@connectionAutoClose[:lock].synchronize do
if is_connection_stale?
reset_connection
end
end
end
end
end


end # def initialize

# Post the given json to Azure Loganalytics
def post_data(body)
raise ConfigError, 'no json_records' if body.empty?
response = nil

@connectionAutoClose[:lock].synchronize do
# close connection if it's stale
if is_connection_stale?
reset_connection
end
if @connectionAutoClose[:is_closed]
open_connection
end

headers = get_header()
# Post REST request
response = @connection.request(method: :post, body: body, headers: headers)
@connectionAutoClose[:is_closed] = false
@connectionAutoClose[:last_use] = Time.now
end
return response

# Create REST request header
headers = get_header()

# Post REST request

return RestClient::Request.execute(method: :post, url: @uri, payload: body, headers: headers,
proxy: @logstashLoganalyticsConfiguration.proxy_endpoint, timeout: 240)
end # def post_data

# Static function to return if the response is OK or else
def self.is_successfully_posted(response)
return (response.code >= 200 && response.code < 300 ) ? true : false
return (response.status >= 200 && response.status < 300 ) ? true : false
end # def self.is_successfully_posted

private

def open_connection
@connection = Excon.new(@uri, :persistent => true, :proxy => @logstashLoganalyticsConfiguration.proxy_endpoint,
expects: [200, 201, 202, 204, 206, 207, 208, 226, 300, 301, 302, 303, 304, 305, 306, 307, 308],
read_timeout: 240, write_timeout: 240, connect_timeout: 240)
@logger.trace("Connection to Azure LogAnalytics was opened.");
end

def reset_connection
@connection.reset
@connectionAutoClose[:is_closed] = true
@logger.trace("Connection to Azure LogAnalytics was closed due to inactivity.");
end

def is_connection_stale?
return Time.now - @connectionAutoClose[:last_use] > @connectionAutoClose[:max_idel_time] && !@connectionAutoClose[:is_closed]
end
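The auto-close logic above reduces to a pure staleness predicate: a connection is stale when it is open and idle past the threshold. A hedged sketch with illustrative names (the plugin itself stores these fields in the `@connectionAutoClose` hash, keyed `:max_idel_time`):

```ruby
# A connection is considered stale when it is open (not closed) and
# has been unused for longer than max_idle_time seconds. Both the
# timer thread and post_data apply this check under the shared mutex.
def connection_stale?(last_use, now, max_idle_time, closed)
  (now - last_use) > max_idle_time && !closed
end
```

Re-checking the predicate inside the `synchronize` block (double-checked locking) avoids resetting a connection that another thread used between the first check and acquiring the lock.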
# Create a header for the given length
def get_header()
# Getting an authorization token bearer (if the token is expired, the method will post a request to get a new authorization token)
@@ -2,7 +2,7 @@

require "logstash/sentinel_la/logAnalyticsClient"
require "logstash/sentinel_la/logstashLoganalyticsConfiguration"

require "excon"
# LogStashAutoResizeBuffer class sets up a resizable buffer which is flushed periodically
# The buffer resizes itself according to Azure Loganalytics and configuration limitations
module LogStash; module Outputs; class MicrosoftSentinelOutputInternal
@@ -59,34 +59,32 @@ def send_message_to_loganalytics(call_payload, amount_of_documents)
return
else
@logger.trace("Rest client response ['#{response}']")
@logger.error("#{api_name} request failed. Error code: #{response.code} #{try_get_info_from_error_response(response)}")
@logger.error("#{api_name} request failed. Error code: #{response.status} #{try_get_info_from_error_response(response)}")
end
rescue RestClient::Exceptions::Timeout => eto
@logger.trace("Timeout exception ['#{eto.display}'] when posting data to #{api_name}. Rest client response ['#{eto.response.display}']. [amount_of_documents=#{amount_of_documents}]")
@logger.error("Timeout exception while posting data to #{api_name}. [Exception: '#{eto}'] [amount of documents=#{amount_of_documents}]'")
force_retry = true

rescue RestClient::ExceptionWithResponse => ewr
response = ewr.response
@logger.trace("Exception in posting data to #{api_name}. Rest client response ['#{ewr.response}']. [amount_of_documents=#{amount_of_documents} request payload=#{call_payload}]")
@logger.error("Exception when posting data to #{api_name}. [Exception: '#{ewr}'] #{try_get_info_from_error_response(ewr.response)} [amount of documents=#{amount_of_documents}]'")
rescue Excon::Error::HTTPStatus => ewr
response = ewr.response
@logger.trace("Exception in posting data to #{api_name}. Rest client response ['#{response}']. [amount_of_documents=#{amount_of_documents} request payload=#{call_payload}]")
@logger.error("Exception when posting data to #{api_name}. [Exception: '#{ewr.class}'] #{try_get_info_from_error_response(ewr.response)} [amount of documents=#{amount_of_documents}]'")

if ewr.http_code.to_f == 400
@logger.info("Not trying to resend since exception http code is #{ewr.http_code}")
return
elsif ewr.http_code.to_f == 408
force_retry = true
elsif ewr.http_code.to_f == 429
# throttling detected, back off before resending
parsed_retry_after = response.headers.include?(:retry_after) ? response.headers[:retry_after].to_i : 0
seconds_to_sleep = parsed_retry_after > 0 ? parsed_retry_after : 30
if ewr.class == Excon::Error::BadRequest
@logger.info("Not trying to resend since exception http code is 400")
return
elsif ewr.class == Excon::Error::RequestTimeout
force_retry = true
elsif ewr.class == Excon::Error::TooManyRequests
# throttling detected, back off before resending
parsed_retry_after = response.data[:headers].include?('Retry-After') ? response.data[:headers]['Retry-After'].to_i : 0
seconds_to_sleep = parsed_retry_after > 0 ? parsed_retry_after : 30

#force another retry even if the next iteration of the loop will be after the retransmission_timeout
#force another retry even if the next iteration of the loop will be after the retransmission_timeout
force_retry = true
end
rescue Excon::Error::Socket => ex
@logger.trace("Exception: '#{ex.class.name}]#{ex} in posting data to #{api_name}. [amount_of_documents=#{amount_of_documents}]'")
force_retry = true
end
rescue Exception => ex
@logger.trace("Exception in posting data to #{api_name}.[amount_of_documents=#{amount_of_documents} request payload=#{call_payload}]")
@logger.error("Exception in posting data to #{api_name}. [Exception: '#{ex}, amount of documents=#{amount_of_documents}]'")
rescue Exception => ex
@logger.trace("Exception in posting data to #{api_name}.[amount_of_documents=#{amount_of_documents} request payload=#{call_payload}]")
@logger.error("Exception in posting data to #{api_name}. [Exception: '[#{ex.class.name}]#{ex}, amount of documents=#{amount_of_documents}]'")
end
is_retry = true
@logger.info("Retrying transmission to #{api_name} in #{seconds_to_sleep} seconds.")
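The 429 branch above sleeps for the `Retry-After` value when the server supplies a positive one, falling back to 30 seconds. A stdlib sketch of that decision, using a plain string-keyed hash in place of excon's `response.data[:headers]`:

```ruby
# Returns how long to back off before retrying a throttled request:
# a positive Retry-After header wins; otherwise a fixed default.
def backoff_seconds(headers, default = 30)
  parsed = headers.include?('Retry-After') ? headers['Retry-After'].to_i : 0
  parsed > 0 ? parsed : default
end
```

Note that `to_i` returns 0 for a missing or malformed header value, so the default also covers servers that send a non-numeric `Retry-After`.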
@@ -110,8 +108,8 @@ def transmission_verb(is_retry)
def get_request_id_from_response(response)
output =""
begin
if !response.nil? && response.headers.include?(:x_ms_request_id)
output += response.headers[:x_ms_request_id]
if !response.nil? && response.data[:headers].include?("x-ms-request-id")
output += response.data[:headers]["x-ms-request-id"]
end
rescue Exception => ex
@logger.debug("Error while getting request id from success response headers: #{ex.display}")
@@ -124,12 +122,13 @@ def try_get_info_from_error_response(response)
begin
output = ""
if !response.nil?
if response.headers.include?(:x_ms_error_code)
output += " [ms-error-code header: #{response.headers[:x_ms_error_code]}]"
if response.data[:headers].include?("x-ms-error-code")
output += " [ms-error-code header: #{response.data[:headers]["x-ms-error-code"]}]"
end
if response.headers.include?(:x_ms_request_id)
output += " [x-ms-request-id header: #{response.headers[:x_ms_request_id]}]"
if response.data[:headers].include?("x-ms-request-id")
output += " [x-ms-request-id header: #{response.data[:headers]["x-ms-request-id"]}]"
end
output += " [response body: #{response.data[:body]}]"
end
return output
rescue Exception => ex
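A key detail of the migration shown above: excon exposes response headers as plain string keys under `response.data[:headers]`, whereas rest-client symbolized them (`:x_ms_error_code`). An illustrative stdlib version of the extraction, taking the `data` hash directly:

```ruby
# Collect Azure diagnostic headers and the response body into a
# log-friendly string; keys follow excon's string-keyed header hash.
def error_info(data)
  out = ''
  headers = data[:headers] || {}
  out << " [ms-error-code header: #{headers['x-ms-error-code']}]" if headers.include?('x-ms-error-code')
  out << " [x-ms-request-id header: #{headers['x-ms-request-id']}]" if headers.include?('x-ms-request-id')
  out << " [response body: #{data[:body]}]"
  out
end
```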
@@ -1,6 +1,6 @@
module LogStash; module Outputs;
class MicrosoftSentinelOutputInternal
VERSION_INFO = [1, 1, 1].freeze
VERSION_INFO = [1, 1, 3].freeze
VERSION = VERSION_INFO.map(&:to_s).join('.').freeze

def self.version
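The bump above is all that is needed for the gem version string, since `VERSION` is derived by joining the frozen array:

```ruby
# version.rb derives the dotted version string from VERSION_INFO.
VERSION_INFO = [1, 1, 3].freeze
VERSION = VERSION_INFO.map(&:to_s).join('.').freeze
```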
@@ -20,8 +20,8 @@ Gem::Specification.new do |s|
s.metadata = { "logstash_plugin" => "true", "logstash_group" => "output" }

# Gem dependencies
s.add_runtime_dependency "rest-client", ">= 2.1.0"
s.add_runtime_dependency "logstash-core-plugin-api", ">= 1.60", "<= 2.99"
s.add_runtime_dependency "logstash-codec-plain"
s.add_runtime_dependency "excon", ">= 0.88.0"
s.add_development_dependency "logstash-devutils"
end
@@ -8,6 +8,9 @@ requiredDataConnectors:
- connectorId: ApacheHTTPServer
dataTypes:
- ApacheHTTPServer
- connectorId: CustomLogsAma
datatypes:
- ApacheHTTPServer_CL
queryFrequency: 10m
queryPeriod: 10m
triggerOperator: gt
@@ -30,5 +33,5 @@ entityMappings:
fieldMappings:
- identifier: Url
columnName: UrlCustomEntity
version: 1.0.1
version: 1.0.2
kind: Scheduled
@@ -8,6 +8,9 @@ requiredDataConnectors:
- connectorId: ApacheHTTPServer
dataTypes:
- ApacheHTTPServer
- connectorId: CustomLogsAma
datatypes:
- ApacheHTTPServer_CL
queryFrequency: 15m
queryPeriod: 15m
triggerOperator: gt
@@ -27,5 +30,5 @@ entityMappings:
fieldMappings:
- identifier: Url
columnName: UrlCustomEntity
version: 1.0.1
version: 1.0.2
kind: Scheduled
@@ -8,6 +8,9 @@ requiredDataConnectors:
- connectorId: ApacheHTTPServer
dataTypes:
- ApacheHTTPServer
- connectorId: CustomLogsAma
datatypes:
- ApacheHTTPServer_CL
queryFrequency: 10m
queryPeriod: 10m
triggerOperator: gt
@@ -27,5 +30,5 @@ entityMappings:
fieldMappings:
- identifier: Address
columnName: IPCustomEntity
version: 1.0.1
version: 1.0.2
kind: Scheduled
@@ -8,6 +8,9 @@ requiredDataConnectors:
- connectorId: ApacheHTTPServer
dataTypes:
- ApacheHTTPServer
- connectorId: CustomLogsAma
datatypes:
- ApacheHTTPServer_CL
queryFrequency: 1h
queryPeriod: 1h
triggerOperator: gt
@@ -29,5 +32,5 @@ entityMappings:
fieldMappings:
- identifier: Address
columnName: IPCustomEntity
version: 1.0.1
version: 1.0.2
kind: Scheduled
@@ -8,6 +8,9 @@ requiredDataConnectors:
- connectorId: ApacheHTTPServer
dataTypes:
- ApacheHTTPServer
- connectorId: CustomLogsAma
datatypes:
- ApacheHTTPServer_CL
queryFrequency: 1h
queryPeriod: 1h
triggerOperator: gt
@@ -31,5 +34,5 @@ entityMappings:
fieldMappings:
- identifier: Address
columnName: IPCustomEntity
version: 1.0.1
version: 1.0.2
kind: Scheduled